Adobe used its Summit in Las Vegas to announce Adobe Sensei GenAI, a set of generative artificial intelligence services that will be included in many of its products.
Adobe refers to the services as a "co-pilot." It said that multiple large language models (LLMs) would be used, including Microsoft's Azure OpenAI Service and the open-source FLAN-T5, with business needs dictating which models are used.
Among the services is Adobe Firefly, which lets users generate new images, audio, video, and 3D content by describing what they want in their own words.
Firefly will be integrated into Adobe Creative Cloud, Document Cloud, Experience Cloud, and Adobe Express, with the first integrations coming to Adobe Express, Experience Manager, and Photoshop. The first release focuses on generating high-quality images and text effects.
The first model is trained on hundreds of millions of assets, including openly licensed and public-domain content. Future models will draw on other assets, technology, and training data.
For creators who don't want their work used to train artificial intelligence models, a new "Do Not Train" tag can be applied; the tag stays with the content wherever it's used, stored, or published. The compensation model for Adobe Stock contributors won't be announced until Firefly is out of its trial period.
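To make the idea concrete, here is a minimal sketch of how an opt-out tag like this could travel with an asset's metadata. This is a hypothetical illustration only: Adobe's actual tag is part of its Content Credentials provenance metadata, and the field name and structure below are invented for this example.

```python
# Hypothetical sketch: a training-consent flag attached to asset metadata.
# The field name "training_consent" is invented for illustration; Adobe's
# real "Do Not Train" tag lives in Content Credentials provenance data.
import json


def tag_do_not_train(asset_metadata: dict) -> dict:
    """Return a copy of the asset metadata with a training opt-out flag.

    Copying (rather than mutating) mirrors the idea that the tag should
    follow the asset wherever it is duplicated, stored, or published.
    """
    tagged = dict(asset_metadata)
    tagged["training_consent"] = "do-not-train"
    return tagged


original = {"title": "Sunset study", "creator": "example_artist"}
tagged = tag_do_not_train(original)
print(json.dumps(tagged, indent=2))
```

Any downstream training pipeline would then be expected to check the flag and skip the asset before ingestion.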
For companies that want to keep their content on brand, Firefly will be trainable to follow brand guidelines.
Content generated with Firefly can be used for commercial purposes, and Firefly will eventually be made available to other platforms through an application programming interface (API).
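Adobe hadn't published that API at the time of the announcement, so the sketch below is purely illustrative: the endpoint URL, parameter names, and request shape are all invented to show roughly what calling a text-to-image service over HTTP looks like, nothing more.

```python
# Hypothetical sketch of a text-to-image API call. The endpoint, payload
# fields, and auth scheme are invented; they do not describe any real
# Adobe API. Uses only the Python standard library.
import json
import urllib.request

GENERATE_ENDPOINT = "https://api.example.com/v1/images/generate"  # placeholder


def build_generate_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) a JSON POST request for image generation."""
    payload = json.dumps({"prompt": prompt, "size": "1024x1024"}).encode("utf-8")
    return urllib.request.Request(
        GENERATE_ENDPOINT,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_generate_request("a lighthouse at dawn, watercolor", "YOUR_API_KEY")
# Actually sending it would be: urllib.request.urlopen(req)
```

Separating request construction from sending keeps the example runnable without network access and makes the payload easy to inspect or test.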
Enhancing Experience Manager
Firefly will be integrated into AEM Assets. This direct integration lets teams instantly change image components and generate asset variations for different channels.
That is just the beginning. With the help of Adobe products, companies will be able to create language models trained on their own customer data and content.
Adobe Journey Optimizer can be used to create message variations, changing the tone of voice or the key words in the copy. The same applies to creating and managing website copy.
Marketo Engage is positioned as using GenAI to power Dynamic Chat, and Customer Journey Analytics will get text-based descriptions for key data points.
Adobe Sensei can help marketers understand how content performs at the attribute level, learning, for example, that East Coast women respond best to orange color tones and a casual voice.
Understanding how content performs at this level would lead to more personalized campaigns and experiences; today's personalization is often too high-level to ensure accuracy and effectiveness.
Content creators can now create, edit, and publish their content in a variety of formats, using content templates and security controls to make sure only the right people can publish. It's an interesting capability that should make publishing easier, and it stands to reason that we'll see something similar inside Word and Google Docs as they continue to grow their built-in generative AI capabilities.
Adobe also announced a new product, Adobe Express for enterprise. Express is a tool designed for non-designers, and the enterprise version uses Firefly.
Supporting the content supply chain
If you work in marketing, you know the effort involved in building campaigns and experiences: developing images, videos, new copy, and other content, and ensuring it all comes together correctly to meet deadlines. Managing multiple teams performing this work gets complicated.
Creative Cloud for enterprise, Adobe Workfront, AEM, Express for enterprise, and Frame.io make up the Content Supply Chain.
Integration between the tools makes the process easier. Thanks to a Workfront plugin, designers can see the work assigned to them in Creative Cloud and submit their work for review without leaving their design tools. Adobe also wants to make it easier to create variations and speed up testing.
A tool that helps content creators is a good idea, even though generative AI has scared a lot of creators (that's another story). Adobe talks about the impact that content demand has on companies: meeting it with smaller teams and tighter budgets puts a lot of stress on workers.
Companies need content that supports what audiences and customers want, but figuring out what that content is remains a challenge. Generative AI creates variations that are tested and adapted until the right content is found.
Adobe is well positioned to support both creating and measuring content. There's a lot of talk right now, with only a portion of what has been announced actually available, but announcing before shipping seems to be the norm these days. The wait probably won't be too long.