Last week was a special one for Adobe (Nasdaq:ADBE) as thousands of professionals and business representatives flew in from around the globe to learn, grow, improve and network at the Adobe Summit. As the first in-person summit in four years, following the pandemic, there is a lot to share from the numerous informative sessions that spanned the three-day period of March 21-23, including some unexpected animal rights protesters from PETA crashing the Keynote.
As businesses around the world, big and small, have turned much of their focus toward digital transformation over the last few years, companies such as Adobe, along with many of its partners, continue to play a large role in the space. Many individuals came to the summit looking for inspiration and ways to level up their own business processes.
Adobe’s focus is to “change the world through digital experiences,” and the company has put much effort into AI innovation for over a decade. During the summit last week, it announced its new generative AI offerings, including Adobe Firefly, whose first model was trained on Adobe Stock images. The announcement promised that Firefly will bring more precision, speed, power and efficiency into Document Cloud, Creative Cloud, Experience Cloud and Adobe Express workflows. Adobe Firefly is currently in public beta.
David Wadhwani, president of digital media business at Adobe, stated, “Generative AI is the next evolution of AI-driven creativity and productivity, transforming the conversation between creator and computer into something more natural, intuitive and powerful. With Firefly, Adobe will bring generative AI-powered ‘creative ingredients’ directly into customers’ workflows, increasing productivity and creative expression for all creators from high-end creative professionals to the long tail of the creator economy.”
Also among Adobe’s AI announcements last week was Sensei GenAI. A new generative AI co-pilot in Adobe Experience Cloud, it will help CX teams and marketers boost productivity without adding to their workloads. Adobe says teams will be able to move seamlessly between Sensei GenAI services and existing features, directly in their workflows for asset creation, planning, personalization, customer journey management and more.
When it comes to the overall abilities of generative AI and what the future has in store, several speakers had insights to share during the summit. Dave Ricks, CEO of Eli Lilly, said that while generative AI is helpful in many aspects, including the marketing sphere, it can also aid chemists by speeding up the discovery of new molecules, among other critical use cases.
Additionally, Adobe shared the rollout of its next-generation Adobe Experience Manager (AEM), whose latest innovations promise to let teams easily update company mobile apps and websites using popular word processing or spreadsheet tools, speeding up enterprise content management. Next-generation AEM leverages Adobe Sensei AI to evaluate content performance.
Marcus East, Executive VP and Chief Digital Officer for T-Mobile, shared that the technologies offered by Adobe would allow T-Mobile to advance in ways that were not possible before. “We knew that we needed to work with the best in the industry and that’s why we chose to work with Adobe,” said East.
These were just a few of the announcements from Adobe last week. A number of the informative and product sessions, including the Keynote, Sneaks and Super Sessions, are available to watch on replay all year, with more sessions becoming available for online viewing on March 30.