I had the opportunity to showcase my UX design and prototyping skills through a project that challenged me to take an existing kind of app and adapt it to a different context. Although this project was not for a client with a specific target audience, it required me to design iteratively, work with prototyping software, and test my prototypes with real users.
The mobile app I chose to adapt was painting by numbers, and I decided that the new context would be AR (augmented reality). Rather than confining a painting to a phone screen, this new context would let users color in a template on their wall. It is important to note that users would experience the product wearing an AR headset and using AR controllers, as opposed to the original interactions, where users tap or swipe their phone screens.
Features included:
Painting selection page, where users can select the template they want to color
Painting resizing so that users can fit the template onto their wall
Coloring page, where users select different colors and color the template
The User: I chose college students as the general user group because they were easily accessible to me at the time, and people in this group would likely understand the concept of AR. To find out what users would want from AR painting by numbers, I interviewed my classmates and other students on campus.
User Needs:
A clean interface that is easy to navigate
The ability to backtrack in case users change their mind
A believable prototype fidelity, where users could actually envision the product being implemented in AR
Unfortunately, I was not able to create a prototype using an actual AR set and controllers because there were none available for the class to use, and I could not afford one at the time. Therefore, I used Figma to design and prototype the user experience of my product.
The home screen has all of the paintings users can choose to color. By hitting the left or right arrows, users can change which painting is selected. Then they can select the desired painting. The point of view is as if the user is facing a wall when viewing this experience.
After selecting the desired painting, users are given the option to resize the template to their liking by adjusting the blue dots in the corners of the template. Once they select the checkmark, the resized template gets populated with numbers, and the coloring interface appears.
Users can select colors and color in the associated areas. After selecting the checkmark, they can finish their work and return to the home screen. Selecting the "X" exits the coloring interface, and selecting the back arrow allows users to backtrack on their work. The experience of coloring in the template was definitely prioritized over the complexity of the painting. Because I prototyped this experience frame by frame, users need to color the template in a fixed order.
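The flow described above (selection, resizing, then coloring, with a fixed coloring order imposed by frame-by-frame prototyping) can be sketched as a simple state model. Everything here is hypothetical: the actual prototype was built in Figma, not code, so the names and structure below are just an illustration of the constraint.

```typescript
// Hypothetical state model of the original frame-by-frame prototype.
type Screen = "selection" | "resize" | "coloring";

interface PrototypeState {
  screen: Screen;
  // Frame-by-frame prototyping forces a fixed coloring sequence:
  // each frame can only advance to the next pre-drawn frame.
  coloringStep: number;
  totalSteps: number;
}

function colorNextRegion(state: PrototypeState): PrototypeState {
  if (state.screen !== "coloring") return state;
  // Users cannot pick an arbitrary region; only the next frame exists.
  const next = Math.min(state.coloringStep + 1, state.totalSteps);
  return { ...state, coloringStep: next };
}

function pressBack(state: PrototypeState): PrototypeState {
  // In this first version, "back" undoes the most recent move only.
  if (state.screen === "coloring" && state.coloringStep > 0) {
    return { ...state, coloringStep: state.coloringStep - 1 };
  }
  return state;
}
```

The point of the sketch is the limitation users later flagged: `colorNextRegion` only ever moves one step forward along a pre-drawn sequence, and `pressBack` only ever undoes one step at a time.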
To test my AR painting by numbers experience, I asked 5 college students to try it out. I knew that a good test requires different users to be given the same task to complete. The task also needs to be specific enough that users know how to perform it, but not so specific that they lose the freedom to test the product's usability. All of the participants tested my product on Figma, and I gave them all the same task directions. As they tested the experience, I observed and took notes. I encouraged the participants to think aloud so that my notes could be more robust.
Task Directions:
Browse the paintings and choose the flower painting
Resize the painting template
Completely color in the template and finish
When users tested the painting selection page, they said it was intuitive to select the arrows to change the painting and to click on the painting to select it. The arrows also changed the paintings the way users expected, similar to a carousel. However, one user told me that displaying only one painting at a time, with just the arrows showing, hid the other painting choices; they didn't feel they had multiple options and felt disinclined to select the arrows in the first place.
In terms of the resizing experience, all users immediately knew to adjust the blue dots in the corners of the template and then select the checkmark. A couple of users told me that they believed the prototype captured what users would see if the experience were in AR. However, the resizing happened so suddenly that the interaction did not feel realistic enough.
All of the users told me that the coloring experience was easy enough to understand and follow (selecting a color and coloring in the associated squares). The main limit of this experience was that users had to color in a specific order, and it felt unnatural to them.
Although my task directions did not mention the backtracking button or the "X" button, some users felt naturally inclined to test those features out, which was something I had hoped for. The "X" button and the checkmark button both sent users back to the painting selection page, and this was confusing. A couple of users told me that if they wanted to exit the coloring interface at any time, they would select the back button and expect to land on the resizing page; then they would select the back button again to return to the painting selection page. They said they wanted to avoid the "X" button because it implied that their work would not be saved.
When users actually selected the back button, they were surprised to see that it simply undid their most recent move. They didn't particularly like that, because it could require many back button presses depending on how far they had colored in the template.
Rather than prototyping frame by frame, especially for the coloring interface, I used Figma components and smart animate to improve the user interactions based on the feedback from testing. This also reduced the number of frames I needed for my prototype.
The painting selection page is now a carousel, where a painting becomes larger to show that the user is about to select it. Users can see the other painting options, and this encourages them to browse their options.
The template resizing interaction is now smoother (the template visually enlarges), which makes my prototype more believable as an AR experience.
Users can color the template in whatever order they want, which creates a natural user experience with more freedom.
The back button now returns the user to the resizing page instead of undoing a single move. It also stays in the same position so that users don't have to shift their gaze when they want to go back.
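The revised interactions amount to two changes: coloring tracks a set of regions rather than a fixed sequence, and the back button navigates between screens instead of undoing a move. A minimal, self-contained sketch with hypothetical names (the real prototype was built in Figma, not code):

```typescript
// Hypothetical sketch of the revised, component-based interactions.
type Screen = "selection" | "resize" | "coloring";

interface RevisedState {
  screen: Screen;
  colored: Set<string>; // regions colored so far, in any order
}

// Users may now color any region at any time, in any order.
function colorRegion(state: RevisedState, region: string): RevisedState {
  if (state.screen !== "coloring") return state;
  const colored = new Set(state.colored);
  colored.add(region);
  return { ...state, colored };
}

// Back now navigates to the previous screen instead of undoing one move.
function pressBack(state: RevisedState): RevisedState {
  if (state.screen === "coloring") return { ...state, screen: "resize" };
  if (state.screen === "resize") return { ...state, screen: "selection" };
  return state;
}
```

Modeling the colored areas as a set rather than a step counter is what gives users the freedom to color in any order, and routing "back" through screens matches the mental model testers described.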
After another round of testing, users liked the carousel, the template resizing, and the freedom they had when coloring the template. They even appreciated the improved backtracking feature. If I were to take this project further, I would run more tests with a larger sample size. I would also implement the coloring experience for the rest of the templates; it could be interesting to see how the interactions would change with more complex templates that use more than three colors. To take the project to the next level, the prototype could be implemented in real AR, with users wearing an AR headset and using AR controllers, which would help me fine-tune the user interactions. With everything in mind, I feel this prototype was a success: I performed user testing, incorporated user feedback, and created a believable AR experience.