The newest buzzword thrown around in cyberspace is “AI”, or Artificial Intelligence. This year has seen a huge increase in the use of AI-based technology, thanks in part to ChatGPT going mainstream and now powering a portion of Bing, Microsoft’s search engine.
It’s no longer news that Google is working to build its own Bard AI into its popular search engine. Now Amazon has joined in: the company’s cloud computing subsidiary released a suite of technologies on Thursday aimed at helping other companies develop their own chatbots and image-generation services, backed by AI of course.
The generative models currently circulating the internet are remarkably versatile, turning plain-language prompts into text and images on demand.
Take Google’s Bard as an example: the company said it would integrate AI technology into its Workspace suite of apps, including Gmail, to help you draft email messages and reply in whatever tone you choose.
ChatGPT, meanwhile, has been incorporated into multiple applications and digital products, making it the most popular generative AI out there.
Amazon Web Services (AWS), for its part, is the largest cloud computing provider, and the company isn’t sitting out the AI battle either, though it is taking a different approach.
AWS’ new service, called Bedrock, will let businesses customize “foundation models”, the core AI technologies that do things such as responding to queries with human-like text or generating images from a prompt.
ChatGPT creator OpenAI, for example, offers a similar service, letting customers fine-tune the models behind ChatGPT to create a custom chatbot.
Those using Bedrock will be able to choose from a menu of foundation models, including Amazon’s proprietary Titan models as well as models offered by other companies.
The first third-party options will come from the startups AI21 Labs, Anthropic, and Stability AI.
The Bedrock service lets AWS customers test-drive those technologies without dealing with the underlying data center servers that power them.
“It’s unneeded complexity from the perspective of the user,” Vasi Philomin, vice president of generative AI at AWS, told Reuters. “Behind the scenes, we can abstract that away.”
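To make that abstraction concrete, here is a minimal sketch of what calling a Bedrock-hosted model from Python might look like. The service name `bedrock-runtime`, the model ID, and the request schema are assumptions based on AWS’s announced design, not details confirmed in this article; the actual network call is shown commented out since it requires AWS credentials.

```python
# Hypothetical sketch of invoking a Bedrock-hosted foundation model.
# The request schema below is an assumed format for a Titan-style
# text-generation model; check AWS documentation for the real one.
import json

def build_text_request(prompt: str, max_tokens: int = 200) -> str:
    """Build a JSON request body for a text-generation model (assumed schema)."""
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": 0.5,
        },
    })

body = build_text_request("Summarize our Q1 customer-support tickets.")
print(body)

# With valid AWS credentials, the managed call would look roughly like:
# import boto3
# client = boto3.client("bedrock-runtime")   # assumed service name
# resp = client.invoke_model(modelId="amazon.titan-text-express-v1", body=body)
```

The point Philomin makes is visible here: the customer only shapes a prompt and picks a model ID; the servers and chips behind `invoke_model` stay hidden.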
To keep the service running efficiently, Amazon will use both its own custom-made AI chips and those from Nvidia, the dominant supplier of AI chips.
“We’re able to land tens of thousands, hundreds of thousands of these chips, as we need them,” Dave Brown, vice president of Elastic Compute Cloud at AWS, said of the company’s custom chips. “It is a release valve for some of the supply-chain concerns that I think folks are worried about.”
There is no sign of AI development and deployment slowing down across the tech world, and that is a reality everyone needs to embrace.