Generative AI Solutions Corp. announced the creation of MAI Cloud(TM), its Artificial Intelligence ("AI") cloud service that will provide High-Performance Computing ("HPC") resources for AI workloads, including rapid processing power, large-scale storage, and state-of-the-art software applications, all delivered over the internet on a pay-as-you-go basis. The Company has incorporated a wholly owned subsidiary, MAI Cloud Solutions Inc., through which the MAI Cloud(TM) platform will be launched.

MAI Cloud Solutions Inc. intends to utilize NVIDIA A100s and H100s to power the MAI Cloud(TM) platform and is currently seeking hosting arrangements with third parties. The NVIDIA A100 and H100 are specialized graphics processing units ("GPUs") designed for AI computing and deep learning tasks. One of the notable features of the A100 and H100 GPUs is their Tensor Core technology.

Tensor Cores are hardware accelerators designed to efficiently perform the matrix operations fundamental to AI computation, such as matrix multiplications and convolutions. Tensor Cores enable faster training and inference for deep neural networks, improving the overall efficiency of AI workloads. MAI Cloud(TM) will enable its customers to access and utilize these resources without the need for physical infrastructure or hardware on their premises.
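As an illustrative sketch only (not part of the Company's announcement), the following minimal Python/PyTorch example shows the kind of half-precision matrix multiplication that Tensor Cores accelerate on A100- and H100-class GPUs; the matrix sizes and the use of PyTorch are assumptions for illustration, not details of the MAI Cloud(TM) platform.

import torch

# Assumes an NVIDIA GPU with Tensor Cores (e.g., A100 or H100) is available;
# falls back to CPU in full precision so the sketch still runs elsewhere.
device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32  # FP16 matmuls engage Tensor Cores on GPU

# Two large matrices of the kind used in deep-learning training and inference.
a = torch.randn(4096, 4096, dtype=dtype, device=device)
b = torch.randn(4096, 4096, dtype=dtype, device=device)

# The core operation Tensor Cores accelerate: a dense matrix multiplication
# (convolutions are lowered to similar matrix operations).
c = torch.matmul(a, b)

print(c.shape)  # torch.Size([4096, 4096])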

AI algorithms require substantial computational power and large datasets for training and inference, which can be efficiently hosted and managed in the cloud. MAI Cloud(TM) will provide scalable, on-demand computing resources that organizations can easily scale up or down based on their AI needs, enabling faster and more efficient training and inference.