See how we used generative AI technology to speed up the design workflow and incorporate an illustrative style in the making of a new poster design
AI tools and technologies are on the cusp of changing our everyday lives. AI is already creeping into the applications we use to do our jobs every day: AI-generated code built directly into GitHub through GitHub Copilot, AI-generated content from ChatGPT (which is helping me write this article), and AI-enabled features now being released into product suites like Adobe Creative Cloud, letting designers and artists perform tasks in seconds that could have taken hours.
Our team recently took a trip to LA to attend Adobe MAX, and AI was one of the most prevalent topics throughout the conference. It was interesting to see how AI-driven productivity tools were being interwoven into Adobe’s products, while the messaging was carefully crafted to emphasize that AI tools are designed to augment human creations, not replace them.
How AI tools will change the design process
Three primary ways AI tools are changing the design process:
- AI-powered design tools: AI-powered design tools can assist with various aspects of the creative design process, such as generating ideas, prototyping, and even creating finished designs. These tools can use machine learning algorithms to analyze existing design patterns and trends, and then suggest new ideas based on this analysis. This can help designers to come up with fresh, innovative concepts more quickly and efficiently.
- Automation of routine tasks: AI can automate routine tasks in the design process, such as resizing graphics or formatting text. This can free up designers to focus on more creative and strategic tasks, rather than getting bogged down in tedious, time-consuming tasks.
- Personalization and customization: AI can create personalized and customized designs for individual users or customers. For example, an AI-powered design tool could analyze a user’s preferences and history, then generate a unique design tailored to their needs and preferences. This could lead to more effective and engaging designs, which are more closely aligned with the individual user’s interests and tastes.
A case study in using Generative AI tools in a design workflow
I decided to use a recent internal project as an opportunity to explore how AI-generated assets could be used to create a new design. My goal was to create a “movie poster” to commemorate the successful launch of a recent client brand, application UI/UX, and website design project. Our client, rockITdata, leans into rocket-themed imagery and messaging to position and promote their next-generation call center services and technology – so a rocket-launch-themed poster was the perfect chance to test what a human designer and AI could do together.
Step 1: The prompt
I had an idea to create a stylized rocket taking off with a huge plume of rocket smoke underneath it. I planned to use the plume of smoke under the rocket to create depth and layer in some of rockITdata’s branded elements. Since this project was more commemorative and fun, I wanted to try a different style, and I thought an oil-painting style for the artistic elements would be interesting. The first step was to write a prompt for DALL-E 2 (the AI image-generation application I used for this exercise) to get back some assets I could use for my composition. I tried a few prompts and eventually landed on: “an oil painting of red rocket ship lifting off with huge clouds of exhaust at the bottom”
Of the results that DALLE-2 returned, I liked one of them, but it ended up being too closely cropped, so I asked it for some variations.
The variations returned an image of a rocket that I liked and eventually used in my final composition. The options were still too closely cropped for me to work with beyond just the rocket, so I refined my prompt a bit: “an oil painting of red rocket ship lifting off with huge clouds of exhaust at the bottom seen from far away”
After a couple of rounds of variations on images that I thought were promising, I arrived at a set of images that ended up being more of what I was looking for. I chose multiple images to extract rocket exhaust and some clouds from so I could composite them together into the look I wanted to achieve.
Crafting the right prompt is critical to working successfully with generative AI tools. The quality of the output depends heavily on the specificity and clarity of the input prompt; without a well-crafted prompt, the AI may produce nonsensical or irrelevant results. Creatives with a vision for the end product, and with expertise in the domain where the AI is being used, are essential to formulating prompts that align with the desired output. Quickly and efficiently generating the right prompts is key to supercharging the overall efficiency and effectiveness of a generative AI workflow.
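For designers iterating on many prompt variants, even a tiny script can help keep the experiments organized. The sketch below is purely illustrative: `build_prompt` and its style/subject/framing parameters are names I made up for this example, not part of any official tool, and the commented-out API call assumes the OpenAI Python client.

```python
# Hypothetical helper for iterating on image-generation prompts.
# build_prompt() and its parameters are illustrative names, not an
# official API.

def build_prompt(style: str, subject: str, framing: str = "") -> str:
    """Assemble a prompt from a style, a subject, and optional framing."""
    prompt = f"{style} of {subject}"
    if framing:
        prompt += f" {framing}"
    return prompt

subject = ("red rocket ship lifting off with huge clouds "
           "of exhaust at the bottom")

# The two prompts from this project, built from shared pieces:
print(build_prompt("an oil painting", subject))
print(build_prompt("an oil painting", subject, "seen from far away"))

# To send a prompt to the OpenAI Images API (requires an API key),
# something like this works with the openai Python package:
#   from openai import OpenAI
#   client = OpenAI()
#   result = client.images.generate(
#       model="dall-e-2", prompt=build_prompt("an oil painting", subject),
#       n=4, size="1024x1024")
```

Keeping the style and framing as separate pieces makes it easy to swap in a new framing phrase, which was exactly the refinement that unlocked usable results in my case.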
Step 2: Compositing
After selecting the images I wanted to work with, I pulled everything into Photoshop and got to compositing. I extracted the rocket using Adobe’s AI-enabled object-selection tools, then chose the rocket-exhaust image and the smoke plumes I wanted to layer together.
After some time in Photoshop building up layering and depth to achieve a sense of scale, I was ready to bring in some branded elements, such as the logo, iconography, and asset imagery, to composite into the design.
This is the part of the process where human involvement still shines. Having saved time during the source material gathering process, I was able to spend more time in the creative phase of ideation and composition, leading to a great result in a short amount of time.
Step 3: Finishing Touches
After compositing all the elements, I used color-balancing and adjustment tools to unify the different shades from the various composited images and bring them in line with the rockITdata brand color family. Here’s the result:
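For the curious, the core idea behind color balancing can be sketched in a few lines: nudge each pixel’s RGB values partway toward a target color. In practice I did all of this interactively in Photoshop, and the brand color below is a placeholder value, not rockITdata’s actual palette.

```python
# Dependency-free sketch of color balancing: blend each (r, g, b)
# pixel partway toward a target color. BRAND_RED is a placeholder,
# not rockITdata's actual brand color.

BRAND_RED = (200, 40, 40)  # placeholder brand color

def blend_pixel(pixel, target, strength=0.15):
    """Move an (r, g, b) pixel `strength` of the way toward `target`."""
    return tuple(round(c + strength * (t - c)) for c, t in zip(pixel, target))

def tint_image(pixels, target, strength=0.15):
    """Apply the blend across a flat list of pixels."""
    return [blend_pixel(p, target, strength) for p in pixels]

# A mid-gray pixel nudged 15% toward the brand red:
print(blend_pixel((128, 128, 128), BRAND_RED))  # (139, 115, 115)
```

Photoshop’s adjustment layers do far more than this (per-channel curves, selective hue shifts), but the principle is the same: pull divergent source colors toward a shared target.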
Conclusion and implications for what’s next
Overall, I really enjoyed this process and was able to go from idea to final composite very quickly. A process that could have taken multiple days, sourcing or painting my own oil-style assets, was cut down to less than half a day. A couple of takeaways:
It’s yet to be seen how AI tools impact the design industry as a whole, but it’s clear to me that compelling vision and creative ideas are what’s going to set designers apart from the machines. The ability to utilize technology and generative AI applications to achieve their vision, through a particular art style or illustration method not readily accessible to them, will make designers feel like they have superpowers.
This technology is continuing to bring the “idea people” closer to the end product – possibly as dramatically as when graphic design methods moved from the physical world (typesetting) to the computer world – it is a democratizing force.
I’m excited to see what visions designers come up with as they have access to rapidly experiment with different mediums and styles to incorporate into their work and to bring to bear on client projects to achieve new, interesting outcomes.
Attribution for this type of art methodology is still up in the air, with positions ranging from the view that AI-generated artwork cannot be copyrighted at all to the view that prompt writers own the copyright of the generated work. We’ll have to see where this lands, and whether we end up ascribing different value to human-generated vs. AI-generated art and design pieces. More on this topic from MIT here.
AI-generated images have the potential to completely replace currently used methods of selecting assets for design compositions. Stock photography and illustration could go away completely if an AI is capable of generating the perfect stock photograph on the fly for exactly what a designer needs.
There is potential disruption in many art and design areas as these technologies mature and get built right into the products we use to create art and digital experiences. It’s certainly an exciting time to be a creative, and it feels fun to have some new superpowers to add to the design process.
In conclusion, the future’s still bright for humans wanting to express creativity, or solve design problems in creative ways… The machines aren’t taking over anytime soon. The combination of talented creatives with the newly added superpowers of generative AI allows for new possibilities in design experimentation, efficiency, and potential.