Amazon Web Services (AWS) announced on Nov. 28 the launch of ‘Q,’ an AI-powered digital assistant and chatbot designed to enhance workplace productivity.
The chatbot, capable of engaging in conversations, solving problems, and generating content and insights, can access company information to provide tailored responses. AWS plans to offer user-based plans with customized features and pricing, emphasizing its commitment to privacy and security by not using content from Amazon Q customers to train the service’s AI model.
Q is expected to be available through Amazon Connect, the company’s customer service product, and QuickSight, its business intelligence tool.
In the same announcement, AWS signaled that it is ready to compete with other work-oriented AI products such as Google Assistant with Bard and Microsoft 365 Copilot. There are also reports that AWS is developing a separate AI search tool for retail customers.
Other AWS updates
In other AI-related developments, AWS announced the production of two new chip families designed for AI applications – Graviton4 and Trainium2.
These chips, available through Amazon Elastic Compute Cloud (Amazon EC2), promise improved price performance and energy efficiency. They are suited to a wide range of machine learning and AI applications and stand alongside chips from other companies such as AMD, Intel, and Nvidia.
A diverse range of companies, including AI firm Anthropic, software firm Databricks, security firm Datadog, gaming platform Epic, observability platform Honeycomb, and multinational software firm SAP, are reportedly using these chips.
The timing of these developments is significant, as AWS’ continued chip production will help offset recent shortages of Nvidia’s AI chips. These moves underscore AWS’s aggressive push into the expanding AI landscape.