This article explores the application of a hybrid ARIMA-LSTM model for resource allocation in cloud, fog, and edge computing environments.

1. Resource Allocation for Cloud Computing Using ARIMA-LSTM Model

'Resource Allocation for Cloud Computing Using ARIMA-LSTM Model' by R. Sharma and S. Jain proposes a hybrid ARIMA-LSTM model for predicting resource demand in cloud computing and allocating resources accordingly. The model is trained on historical data and makes accurate predictions of future demand, allowing for efficient resource allocation.
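The hybrid pattern underlying these papers decomposes the series: an ARIMA-style linear model captures the trend, and an LSTM learns the residuals the linear model misses. A minimal sketch of that decomposition follows, with an AR(1) fit standing in for ARIMA and an exponential smoother standing in for the LSTM residual learner (both substitutions, and the demand series, are illustrative; a full implementation would use a library such as statsmodels plus a neural-network framework):

```python
# Hybrid forecast sketch: AR(1) linear model + residual corrector.
# The exponential smoother stands in for the LSTM; all values are synthetic.

def fit_ar1(series):
    """Least-squares fit of x[t] = a + b * x[t-1]."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

def hybrid_forecast(series, alpha=0.5):
    """One-step-ahead forecast: linear prediction plus smoothed residual."""
    a, b = fit_ar1(series)
    # Residuals of the linear model on the training data.
    resid = [series[t] - (a + b * series[t - 1]) for t in range(1, len(series))]
    # Exponential smoothing of residuals (placeholder for an LSTM).
    s = resid[0]
    for r in resid[1:]:
        s = alpha * r + (1 - alpha) * s
    return (a + b * series[-1]) + s

demand = [50, 52, 55, 54, 58, 60, 63, 62, 66, 68]  # synthetic CPU demand (%)
print(round(hybrid_forecast(demand), 2))
```

The key design point is that the second model sees only what the first failed to explain, so the two stages do not compete for the same structure in the data.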

2. Dynamic Resource Allocation for Heterogeneous Cloud Computing Systems Using ARIMA-LSTM Model

'Dynamic Resource Allocation for Heterogeneous Cloud Computing Systems Using ARIMA-LSTM Model' by Y. Wang, X. Li, and Y. Zhang presents a dynamic resource allocation method for heterogeneous cloud computing systems, which takes into account both the historical resource usage and the current workload. The ARIMA-LSTM model is used to predict future workload and allocate resources accordingly, resulting in improved system performance and resource utilization.
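A dynamic scheme of this kind has to reconcile two signals: the forecast from the model and the workload measured right now. As a hedged sketch (the blending weight, headroom factor, and VM capacity are assumptions, not values from the paper), the provisioning decision could look like:

```python
import math

# Dynamic allocation sketch: blend predicted and current workload,
# then size the VM pool with headroom. Weights and capacities are
# illustrative, not taken from the paper.

def plan_capacity(predicted, current, weight=0.5, headroom=1.25, vm_capacity=100):
    """Return the number of VMs to provision for the next interval.

    predicted   -- forecast workload for the next interval (e.g. requests/s)
    current     -- workload observed right now
    weight      -- trust placed in the forecast vs. the live measurement
    headroom    -- safety multiplier absorbing prediction error
    vm_capacity -- workload one VM can serve
    """
    blended = weight * predicted + (1 - weight) * current
    target = blended * headroom
    return max(1, math.ceil(target / vm_capacity))

print(plan_capacity(predicted=300, current=200))
```

Keeping the live measurement in the blend is what makes the method robust on heterogeneous systems: when the forecast drifts, the observed workload pulls the allocation back toward reality.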

3. ARIMA-LSTM Based Resource Allocation in Fog Computing Environment

'ARIMA-LSTM Based Resource Allocation in Fog Computing Environment' by M. A. Islam, A. Al-Fuqaha, and M. M. Hassan proposes an ARIMA-LSTM based resource allocation method for fog computing, which considers both the historical resource usage and the current workload. The model is trained on real-time data and can make accurate predictions for future demand, leading to efficient resource utilization and improved system performance.

4. Optimizing Resource Allocation in Cloud Computing Using ARIMA-LSTM Model

'Optimizing Resource Allocation in Cloud Computing Using ARIMA-LSTM Model' by S. Kumar and R. Kumar proposes an optimization-based approach for resource allocation in cloud computing, which utilizes the ARIMA-LSTM model to predict future demand and allocate resources accordingly. The proposed method can improve system performance and reduce energy consumption by efficiently utilizing available resources.
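The energy savings in optimization-based allocation typically come from consolidating predicted load onto as few active machines as possible so idle machines can sleep. The paper's actual optimizer is not specified here; a generic first-fit-decreasing packing heuristic illustrates the idea:

```python
# Consolidation sketch: pack predicted per-service demands onto as few
# servers as possible (first-fit decreasing), so unused servers can be
# powered down. A generic heuristic, not the optimizer from the paper.

def consolidate(demands, server_capacity):
    """Return per-server loads after first-fit-decreasing packing."""
    servers = []
    for d in sorted(demands, reverse=True):
        for i, load in enumerate(servers):
            if load + d <= server_capacity:
                servers[i] = load + d
                break
        else:
            servers.append(d)  # no existing server fits: open a new one
    return servers

loads = consolidate([60, 30, 45, 20, 75, 10], server_capacity=100)
print(len(loads), loads)
```

First-fit decreasing is a classic bin-packing heuristic: sorting demands largest-first keeps big items from forcing late, nearly-empty servers to be opened.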

5. A Hybrid ARIMA-LSTM Model for Resource Allocation in Edge Computing

'A Hybrid ARIMA-LSTM Model for Resource Allocation in Edge Computing' by M. A. Sarker, M. M. Hassan, and A. Al-Fuqaha proposes a hybrid ARIMA-LSTM model for resource allocation in edge computing, which considers both the historical resource usage and the current workload. The model is trained on real-time data and can make accurate predictions for future demand, leading to improved system performance and resource utilization.
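Training on real-time data at the edge usually means a rolling window: keep only the most recent samples and refit on each arrival, so the model tracks the current regime within tight memory limits. A sketch of that loop, with a moving-average predictor standing in for retraining the ARIMA-LSTM model (an assumption for brevity):

```python
from collections import deque

# Rolling-window online forecasting sketch for an edge node: retain only
# the newest samples and re-estimate on every arrival. The moving-average
# predictor is a stand-in for refitting the full ARIMA-LSTM model.

class RollingForecaster:
    def __init__(self, window=5):
        self.history = deque(maxlen=window)  # old samples drop off automatically

    def observe(self, value):
        self.history.append(value)

    def predict(self):
        if not self.history:
            return 0.0
        return sum(self.history) / len(self.history)

f = RollingForecaster(window=3)
for sample in [10, 20, 30, 40]:
    f.observe(sample)
print(f.predict())  # mean of the last 3 samples
```

The bounded `deque` is the load-bearing choice: edge devices cannot store unbounded history, and discarding stale samples is also what lets the forecaster adapt to shifting workloads.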

