Cloudnone interviews Mr. Wattasan Teerapatthapong, the Country Manager of AWS Thailand, about the future of Generative AI in 2024 through the lens of the world’s largest cloud service provider.
In this interview, Mr. Wattasan reflects on the market outlook for Generative AI as it transitions from experimental to practical use. Some of AWS's Thai business customers have already begun using the Amazon Bedrock service.
On the Amazon side, the approach to Generative AI differs from competitors in that Bedrock is positioned as an open service where customers can "choose" their own Generative AI models. These range from Amazon's self-developed Titan family, to models such as Claude from Anthropic, to open models like Llama and Mistral.
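To illustrate that "choosing a model" on Bedrock is largely a matter of swapping the model ID and request payload, here is a minimal sketch in Python. The payload shapes follow the published Bedrock request formats for each model family, but the exact model IDs used in the comments are illustrative, not an endorsement from the interview.

```python
import json

def build_request(model_id: str, prompt: str, max_tokens: int = 256) -> dict:
    """Build a Bedrock invoke_model request for a given model family.

    Each family expects its own body shape; the calling code stays the same.
    """
    if model_id.startswith("anthropic."):
        # Anthropic models on Bedrock use the Messages API format.
        body = {
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": max_tokens,
            "messages": [{"role": "user", "content": prompt}],
        }
    elif model_id.startswith("meta."):
        # Meta Llama models take a plain prompt with a generation-length cap.
        body = {"prompt": prompt, "max_gen_len": max_tokens}
    elif model_id.startswith("amazon."):
        # Amazon Titan text models use inputText plus a config object.
        body = {
            "inputText": prompt,
            "textGenerationConfig": {"maxTokenCount": max_tokens},
        }
    else:
        raise ValueError(f"unknown model family: {model_id}")
    return {"modelId": model_id, "body": json.dumps(body)}

# Invoking any of them is then one and the same call, e.g. (not run here):
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   resp = client.invoke_model(**build_request("meta.llama3-8b-instruct-v1:0", "Hello"))
```

The point of the sketch is the shape of the abstraction: the model catalogue varies, but the invocation path through Bedrock does not.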
Amazon’s concept of “creating choices” also extends to the underlying compute infrastructure, offering a variety of options: traditional x86 CPUs, cost-saving Arm-based Graviton CPUs, powerful NVIDIA GPUs for heavy workloads, and AWS’s custom-designed Inferentia/Trainium chips.
Furthermore, the Amazon Bedrock service stands out for its integration with other AWS services commonly used by enterprise customers, such as S3, EC2, and Lambda. This seamless integration removes the need to move data back and forth between systems, and includes tools for easily connecting organizational data with Agents.
Mr. Wattasan elaborates on all these points in the full interview.
Additionally, AWS is set to host the AWS Summit Bangkok on May 30th at the Sirikit National Convention Center. For more information on the agenda and registration, visit AWS Summit Bangkok.
TLDR: Cloudnone interviews AWS Thailand’s Country Manager on the future of Generative AI in 2024, highlighting Amazon’s unique approach to Bedrock as an open service with diverse model options and strong integration with other AWS services. Attend the upcoming AWS Summit Bangkok on May 30th for more insights.