
CHAMOMILE GROVE


OVERVIEW

Chamomile Grove is a single-player hybrid game in which the player farms ingredients, brews potions, and solves puzzles with them.

The game was made in Unreal 4.25 by a team of 18 students, including artists, composers, designers, and programmers, over the span of 8 months, of which I participated in the last 4.

My focus area in this project was User Experience Design, although I did some User Interface Design work as well.


Introduction

Farming/crafting and puzzle gameplay types are radically different in nature, a difference that carries over into their respective UI designs.

 

Once I joined team Apathetic Apothecary, which was working on Chamomile Grove, a farming/crafting and puzzle game with narrative elements, I set a personal goal of delivering the most cohesive UX & UI I could produce, tying together these different gameplay styles and their respective UIs in order to deliver a coherent experience to the player.


Project Goals

For this project, my goals were:

  • Primary Goal: Create a cohesive and consistent UX & UI design across the farming, crafting, puzzle-solving, and dialogue gameplay.
     

  • Secondary Goal: Research, design, and deliver a polished dialogue UX & UI.


Cohesive UI & UX

When I joined this project, the UI still consisted of placeholder assets, and the UX design was almost nonexistent.

 

I also found myself working on a project that incorporated elements from a variety of gameplay types, such as farming, puzzle, and RPG.

My main objective was to deliver a coherent experience that would feel like more than a collection of different gameplay types and interfaces crammed together.

 

Instead, I wanted every single signifier and piece of feedback to feel as if it belonged to the same world space.

Unsurprisingly (our game was designed to be unique and go beyond traditional gameplay archetypes), I was not able to find a single resource that covered all of my questions, so I broke the research up into three more clearly defined gameplay segments: farming, crafting, and puzzles.


COMPETITOR RESEARCH (FARMING)

For the farming, our idea was to keep it simple and let the player either take it easy or rush through it if they felt the need.

 

When I joined the team, I found a rough watering mechanic already implemented, in which plants stopped growing whenever they were not watered, a system similar to many of our reference points, such as Animal Crossing and Minecraft.

The epiphany came when I discovered the single-harvest crop system in Stardew Valley, where crops grow whether or not you water them.

 

So I decided to let players pick their own pace: plants keep growing even if neglected, but grow much faster when properly taken care of. The player can plant a crop, go solve one of the game's puzzles, and come back to a fully grown harvest, or grow and harvest everything in one sitting.

With that settled, a set of signifiers and feedback elements had to be introduced to show the player the current watering level and growth progress.
Oddly enough, the answer came from Dakka Squadron, a relatively niche Warhammer air combat game: the way its HP counter goes down creates a pleasant contrast with its heat counter going up, often resulting in a dynamic visual balance between the two UI elements.


I implemented a similar approach in the design of our growth and watering bars, with the former filling up and the latter draining down.

Further passes on the growth and watering bars replaced their standard rectangular shape with an inverted-triangle plant shape for the former and a regular-triangle water drop shape for the latter, doubling down on the asymmetric visual balance between the two elements.
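To illustrate the rule described above, here is a minimal sketch in plain C++ (not the actual Blueprint logic; the class name, rates, and thresholds are all hypothetical): growth always advances, it advances faster while the crop is watered, and the watering level drains over time while the growth bar fills up.

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative sketch of the "pick your pace" growth rules described above.
// Names, rates, and thresholds are hypothetical, not project code.
struct Crop {
    float growth = 0.0f;  // 0..1, drives the upward (plant-shaped) bar
    float water  = 0.0f;  // 0..1, drives the downward (drop-shaped) bar

    void Water() { water = 1.0f; }

    void Tick(float dt) {
        // Crops keep growing even if neglected...
        float rate = 0.01f;
        // ...but grow much faster while the watering level is above zero.
        if (water > 0.0f) rate *= 4.0f;

        growth = std::min(1.0f, growth + rate * dt);
        water  = std::max(0.0f, water - 0.05f * dt);  // watering bar drains down
    }

    bool IsHarvestable() const { return growth >= 1.0f; }
};

int main() {
    Crop c;
    c.Water();
    for (int i = 0; i < 60; ++i) c.Tick(1.0f);  // simulate some in-game time
    std::printf("growth=%.2f water=%.2f\n", c.growth, c.water);
}
```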


COMPETITOR RESEARCH (CRAFTING)

Crafting was originally envisioned to let the player create a large number of potions, and was heavily inspired by systems such as Minecraft crafting and Magicka 2 spells.

 

However, the UX design of the crafting system I was presented with when I joined the team felt flat and complicated.

 

To juice it up, I researched games such as Minecraft, Rust, and even Fortnite to get a broader picture of how to structure a crafting system that includes a recipe book and synergizes with an inventory bar, without resorting to an internal player inventory, which would have complicated a game designed to be simple and streamlined.

After structuring the recipe book using Minecraft's layout (recipes on the left page and ingredients on the right) as a point of reference, I moved on to improving the existing worldspace UI piece from which the player crafts potions, juicing it up with dynamic sprite signifiers.
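As a rough sketch of the structure I had in mind (hypothetical names and plain C++ rather than the game's Blueprints), a recipe can be checked directly against the visible inventory bar, with no hidden internal inventory:

```cpp
#include <map>
#include <string>
#include <vector>
#include <iostream>

// Hypothetical sketch: a recipe is craftable only if the visible inventory bar
// alone holds enough of each ingredient (no hidden internal inventory).
using Ingredient = std::string;

struct Recipe {
    std::string result;
    std::map<Ingredient, int> ingredients;  // ingredient -> required count
};

struct InventorySlot {
    Ingredient item;
    int count = 0;
};

bool CanCraft(const Recipe& recipe, const std::vector<InventorySlot>& bar) {
    for (const auto& [ingredient, needed] : recipe.ingredients) {
        int have = 0;
        for (const auto& slot : bar)
            if (slot.item == ingredient) have += slot.count;
        if (have < needed) return false;
    }
    return true;
}

int main() {
    // Item and potion names below are made up for the example.
    Recipe potion{"Example Potion", {{"Chamomile", 2}, {"Water", 1}}};
    std::vector<InventorySlot> bar = {{"Chamomile", 3}, {"Water", 1}};
    std::cout << std::boolalpha << CanCraft(potion, bar) << "\n";  // true
}
```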

 

For that part, no competitor was able to provide useful insights, so I mostly relied on Swink's book Game Feel and Prof. Jami Lukins' feedback.


COMPETITOR RESEARCH (PUZZLE)

Designing the UX for the puzzle gameplay segment was the toughest task of the three, as reference material with similar gameplay was really scarce.

 

To tackle this lack of examples, I fell back on the general guidance provided by the book Game Feel and then analyzed the work we already had in the farming/crafting sections to see whether we could reuse assets and systems to provide clear guidance in the puzzle levels.

That course of action had two benefits. The first is that it reduced production time, as we simply recycled and modified existing assets to fit their new purpose.

 

The second is that it started to build visual and conceptual consistency across gameplay segments, so I designed a series of item-interaction pop-up signifiers that appear right above the player's head, replicating the system already in place for farming and adjusting it to the circumstances of the puzzle levels.
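A minimal sketch of that idea, with hypothetical names and in plain C++ rather than the actual widget code: the same pop-up component serves both farming and puzzle objects, and only the prompt data changes per interactable.

```cpp
#include <optional>
#include <string>
#include <iostream>

// Hypothetical sketch: one pop-up signifier shared by farming and puzzle
// objects; only the prompt icon/text differs per interactable.
struct InteractionPrompt {
    std::string icon;   // e.g. "watering_can", "lever"
    std::string text;   // e.g. "Water", "Pull"
};

struct Interactable {
    InteractionPrompt prompt;
    float radius = 150.0f;  // interaction range, arbitrary units
};

// Returns the prompt to show above the player's head, if the object is in range.
std::optional<InteractionPrompt> PromptFor(const Interactable& obj, float distance) {
    if (distance <= obj.radius) return obj.prompt;
    return std::nullopt;
}

int main() {
    Interactable lever{{"lever", "Pull"}};
    if (auto p = PromptFor(lever, 100.0f))
        std::cout << "Show: " << p->icon << " / " << p->text << "\n";
}
```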

[Images: the UI asset set produced for the game, including the attribute book, brewing box, buttons, ticket, new potion card, exit confirmation, title, objective marker, pause and options menus, shop box, selection marker, dialogue box, item background, and the button, brewing, and background highlight states.]

IMPLEMENTING A COHESIVE SYSTEM

As previously mentioned, after nailing down the specific UI and UX design for each gameplay type, I needed to bring them together under the same umbrella, or, in other words, make the experience coherent and cohesive across gameplay types.

To do so, I identified three elements to work on: a consistent usability flow across the different interfaces, a clear visual distinction between interactable and non-interactable UI elements, and standardized visuals to signify a selection, whether in 3D space, on UI elements, or, in some cases, both at the same time.

To create a consistent usability flow across the different interfaces (shop, pocket portal, recipe book, and storage), each of them was simplified: sub-menus were removed and an identical “go back” button/icon (depending on whether the player is using mouse & keyboard or a controller) was implemented in each of them. This provided a standardized core framework that reduces the cognitive load needed to navigate those menus, leaning on the concept of pattern repetition discussed in Swink's book Game Feel.
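The sketch below (hypothetical names, plain C++ rather than the actual widget Blueprints) illustrates the idea: every menu asks the same helper for its "go back" signifier, so the icon stays identical across the shop, pocket portal, recipe book, and storage, and follows the active input device.

```cpp
#include <string>
#include <iostream>

// Hypothetical sketch: one shared helper picks the "go back" signifier,
// so every menu shows the same icon for the same input device.
enum class InputDevice { MouseKeyboard, Gamepad };

std::string BackIconFor(InputDevice device) {
    switch (device) {
        case InputDevice::MouseKeyboard: return "icon_key_esc";    // placeholder asset name
        case InputDevice::Gamepad:       return "icon_button_b";   // placeholder asset name
    }
    return "icon_key_esc";
}

int main() {
    std::cout << BackIconFor(InputDevice::Gamepad) << "\n";  // icon_button_b
}
```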

To double down on the repeating patterns in the menus, I collaborated with the designated UI artist Michelle Deitrik (Portfolio Website) to create clear visual distinctions between interactable and non-interactable UI, starting from color theory.

So every interactable and, more specifically, selectable UI element (such as buttons, storage slots, recipe slots, and the dialogue answer panel, to cite a few) always features a darker outline (later standardized as a dark brown color) on a lighter (later standardized as light tan) background panel that may contain one or more selectable elements. Non-interactable UI elements (such as the available-quest icon, the dialogue pane, and the inventory bar background) instead feature a darker main body (mostly brown) with light colors used as accents, creating a clear contrast with the interactable ones.

During further development of the UI visuals, I collaborated with Michelle to create a uniform visual language with patterns and textures. For example, selectable UI elements always sit on a tan, parchment-like background texture, with a brown font color whenever they feature text, while non-interactable elements (such as frame boxes and dialogue panels) use a darker (generally brown) background with an inner, lighter, desaturated frame to provide accents and a downward shadow to make them look less “flat”.
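A sketch of what such a shared palette could look like: the exact color values below are placeholders of mine, not the shipped values, but the split follows the rules just described (light parchment with a dark brown outline for selectable elements, darker brown body with light accents for non-interactable ones).

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical shared style palette. Hex values are placeholders; the point is
// that every widget reads from one place, keeping the split consistent.
struct Color { std::uint8_t r, g, b, a = 255; };

namespace UIStyle {
    // Interactable / selectable elements (buttons, slots, answer panels)
    constexpr Color SelectableBackground {222, 205, 170};  // light tan parchment
    constexpr Color SelectableOutline    { 74,  48,  28};  // dark brown
    constexpr Color SelectableText       { 74,  48,  28};  // brown font

    // Non-interactable elements (frames, dialogue panel, inventory background)
    constexpr Color PanelBody            { 94,  62,  38};  // darker brown
    constexpr Color PanelAccent          {214, 199, 176};  // light desaturated frame
}

int main() {
    // Example: a button widget would pull its colors from UIStyle at setup time.
    std::printf("outline rgb(%d, %d, %d)\n",
                UIStyle::SelectableOutline.r,
                UIStyle::SelectableOutline.g,
                UIStyle::SelectableOutline.b);
}
```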

Designing a universal highlight system that behaves in similar ways in both 2D and 3D space was a bit more challenging: many ideas were brainstormed and just as many were thrown away, because they were either not applicable, very hard to implement, or simply didn't look and feel good once implemented in the game.
 
At that stage of development, buttons simply used a darker tint and a change in size to signify that they had been selected or hovered, 3D meshes used a bright yellow outline to signify that they were interactable, and text simply had an arrow icon next to the answer to be selected.

In this case, the answer to my problems came through testing.

So I ended up implementing a turquoise outline for all selectable 3D meshes (puzzle elements, crops, the cauldron, and so forth), a matching turquoise glow outline for 2D UI elements (buttons, inventory squares, recipes, and so forth), and a turquoise glow for selectable text (dialogue boxes), providing a uniform signifier for selection.
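As a sketch of the underlying idea (a hypothetical interface in plain C++, not the actual Unreal setup), a single "selectable" abstraction lets 3D meshes, 2D widgets, and dialogue text all react to selection with the same turquoise treatment:

```cpp
#include <memory>
#include <vector>
#include <iostream>

// Hypothetical sketch: one "selectable" interface shared by 3D meshes,
// 2D widgets, and dialogue text, so selection always triggers the same
// turquoise highlight regardless of where the element lives.
struct ISelectable {
    virtual ~ISelectable() = default;
    virtual void SetHighlighted(bool on) = 0;
};

struct MeshSelectable : ISelectable {       // crops, cauldron, puzzle props
    void SetHighlighted(bool on) override {
        std::cout << (on ? "enable" : "disable") << " turquoise mesh outline\n";
    }
};

struct WidgetSelectable : ISelectable {     // buttons, inventory squares, recipes
    void SetHighlighted(bool on) override {
        std::cout << (on ? "show" : "hide") << " turquoise glow border\n";
    }
};

struct TextSelectable : ISelectable {       // dialogue answers
    void SetHighlighted(bool on) override {
        std::cout << (on ? "apply" : "clear") << " turquoise text glow\n";
    }
};

int main() {
    std::vector<std::unique_ptr<ISelectable>> selectables;
    selectables.push_back(std::make_unique<MeshSelectable>());
    selectables.push_back(std::make_unique<WidgetSelectable>());
    selectables.push_back(std::make_unique<TextSelectable>());
    for (auto& s : selectables) s->SetHighlighted(true);
}
```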


DIALOGUE UI & UX

Our game features dialogue between characters, and a dialogue interface requires a different design approach than, say, a storage or inventory menu, because it serves different gameplay needs.


The dialogue UX and UI design didn't only need to be functional; it also had to reinforce the narrative and accommodate the needs of our narrative designer, Alora Newbury (LinkedIn Profile), allowing her to convey the story in the ways she deemed fit.

The goals for the UX and UI design of this interface were the following:

  • To have good interface usability

  • To be character-centric

  • To synergize well with the narrative designer's work
     

To take these factors into account, I had to design something that follows the principles of good UX design, changes dynamically according to the character who is talking, and accommodates the narrative designer's needs.
 

In the following sections I will explain in detail how I considered each of these priorities during the implementation process and what the final outcome was.


COMPETITOR RESEARCH

Creating a dialogue UX and UI design from scratch proved to be a challenge, especially because I have personally played few RPGs in my life.

 

Rather than looking at Western games, I felt that a more JRPG-esque layout would better fit the aesthetics of the game, so I started researching and taking references from Eastern games such as the Persona franchise, Rune Factory, and Animal Crossing, as well as a Pinterest gallery collecting multiple examples of JRPG dialogue UI.

On top of that, I also established a communication channel with Prof. Jami Lukins (Portfolio Website) to receive constant feedback, first on the initial UI wireframes and later on the medium- to high-fidelity mockups.


PROTOTYPE WIREFRAME

The first step, after gathering enough reference material, was to sketch a couple of low-fidelity paper wireframes and run them by the narrative designer for feedback, focusing on her needs rather than on usability, which I addressed later.

Once Alora and I agreed on a base layout, which included only one character portrait displayed on screen at any time and a pop-up answer box, I moved on to producing a series of medium-fidelity greyscale wireframe variants to refine the layout and visual hierarchy of the UI elements, getting weekly feedback from Prof. Lukins to improve the usability of those menus and experimenting with panels, shadows, sizes, and navigation flows.

After three iterations, the feedback was positive enough for me to move on to the next phase, implementation, and to start working with the designated UI artist Michelle Dietrich to produce assets that would fit both the aesthetic of the game and the wireframes produced.

A great source of inspiration was the Persona 4 dialogue UI layout, with a text panel, name box, character portrait, and a pop-up answer panel that can easily be hidden without modifying the whole layout, should the occasion arise.

[Images: dialogue UI implementation screenshots and Camille portrait sprites (happy, mad, neutral, sassy).]

IMPLEMENTATION

By the time we implemented the interface in the game, we had already developed the aforementioned guidelines for interactable and non-interactable UI elements, described in the Implementing a Cohesive System section of this paper, so, looking at the planned layout, Michelle and I worked together on tying a specific pattern to each section, where applicable.


So the text box, which is not interactable, was given a dark gradient background with an off-white, spaced serif font (Yrsa) and a shadow backdrop, both aimed at providing contrast and increasing readability.

As for the answer box, we agreed on using the aforementioned parchment background with a brown serif font that is highlighted upon selection, sticking to the interactivity visual guidelines defined in a previous section.


Since the narrative designer wanted to give the player the option to reply to NPCs with three different emotions, we had to find a way to, first, standardize the types of dialogue options and, second, find an appropriate signifier for them.

The solution was partially inspired by Dragon Age's dialogue wheel: such a system would not fit the gameplay and aesthetics of our game, but small, simple icons could be the answer we were looking for. So I designed an emoticon system that, paired with the narrative designer's text, accurately conveys a preview of the three possible dialogue options and their respective emotions.

 

This was also made possible by collaborating with the narrative designer on a short “text preview” for each dialogue option, which the player character then builds on during their dialogue sequence (e.g. the “Not now” option became “Fiz, not now, can’t you see I’m busy trying to save a forest?” once the main character’s line appeared on screen). This was inspired by Cyberpunk 2077’s dialogue system.
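A sketch of the data each dialogue option could carry under this design (a hypothetical structure, not the actual Blueprint asset): the emotion drives the emoticon, the preview is shown in the answer box, and the full line is what the player character actually says. The "Not now" example is the one cited above.

```cpp
#include <string>
#include <vector>
#include <iostream>

// Hypothetical sketch of a dialogue option: emotion tag (drives the emoticon),
// short preview (shown in the answer box), and the full spoken line.
enum class Emotion { Happy, Mad, Neutral, Sassy };

struct DialogueOption {
    Emotion emotion;
    std::string preview;    // shown next to the emoticon in the answer box
    std::string fullLine;   // line the player character builds on the preview
};

int main() {
    // In the game, the answer box would hold three of these at a time;
    // only the example quoted above is reproduced here.
    std::vector<DialogueOption> options = {
        {Emotion::Mad, "Not now",
         "Fiz, not now, can't you see I'm busy trying to save a forest?"},
    };
    for (const auto& o : options)
        std::cout << o.preview << " -> " << o.fullLine << "\n";
}
```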

While not originally planned, the dialogue system was further expanded with a facial expression feature: both the main character and the NPCs change the expression of their portrait based on the emotion of the sentence they are saying, further solidifying the narrative purpose of this interface. This feature was inspired by Persona 5's dialogue system.
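A minimal sketch of how the portrait swap could be driven by the same emotion tag (hypothetical code; the sprite names mirror the Camille portrait assets shown above):

```cpp
#include <map>
#include <string>
#include <iostream>

// Hypothetical sketch: the portrait texture is picked from the emotion tag of
// the sentence currently being spoken.
enum class Emotion { Happy, Mad, Neutral, Sassy };

std::string PortraitFor(Emotion e) {
    static const std::map<Emotion, std::string> portraits = {
        {Emotion::Happy,   "UI_CamilleSprite_Happy.png"},
        {Emotion::Mad,     "UI_CamilleSprite_Mad.png"},
        {Emotion::Neutral, "UI_CamilleSprite_Neutral.png"},
        {Emotion::Sassy,   "UI_CamilleSprite_Sassy.png"},
    };
    return portraits.at(e);
}

int main() {
    std::cout << PortraitFor(Emotion::Sassy) << "\n";
}
```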

An animation system was designed to “peggle up” the dialogue interface and make it juicier (with panels and sprites sliding in and out of the scene, and a sliding animation added to the answer box); unfortunately, we ran out of development time to implement it. Nevertheless, a Unity prototype was created and will serve as a guideline for the next UI/UX designer on the team.

EXPERIENCES & INSIGHTS

WHAT WENT WELL

The collaboration with the designated UI artist went exceptionally well: we shared the same vision for the UI systems, and each UI element went back and forth between the two of us until it reached a state that satisfied us both, meaning it was both aesthetically pleasing and functional.

 

The direct communication channel we set up allowed us to do rapid-fire iteration across all assets and kept us on the same page at all times.

 

It felt almost as if we were working in sync, and it was a truly rewarding experience.

Another thing that went really well was that I was able to cover all the core areas I wanted to cover. At the beginning of the semester, when I joined the project, the creative director briefed me on all the functionality she wanted in our game, and over the first week I realized I had a high number of menus to wireframe, iterate on, and implement to accommodate her requests. Given the development time we had, I was afraid I would not be able to finish them in full. I was wrong.


So, instead of developing each menu from start to finish before moving on to the others, aware that this was a mistake I had made in previous courses, I developed all of them in parallel, which allowed me to better identify potential time sinks and devise strategies to avoid them.

Another benefit of this method is that I shipped a game with equally polished interfaces: my previous approach had led me to ship products with one or two very well polished interfaces that contrasted badly with the ones I hadn't had as much time to work on, creating a notable visual inconsistency.


This time, given the scope and development time of this project, I decided to sacrifice some polish time on the more common, recurring interfaces in order to bring all interfaces to the same level of polish before starting the next polish pass, considering this tradeoff beneficial to the project as a whole.

WHAT COULD HAVE GONE BETTER

The project was made in Unreal, and, unfortunately, this was my first time approaching the engine.

 

Due to my inexperience, I wasn't able to identify some pipeline mistakes left by the previous UX/UI designer on the team until the last milestone, when it was too late for me or any other team member to fix them, since many of the widgets were already hooked up to multiple Blueprints and we couldn't afford the time needed to restructure them.

 

A concrete consequence was that I had no way to get precise, pixel-perfect measurements of the placement and size of widgets within the canvas, and I had to resort to a lot of eyeballing, which ended up being a massive time sink and yielded average results.

To make room for the time needed to eyeball the size of every UI element (including every single in-game button), I had to cut the creation and implementation of UI animations from my schedule. I still wrote a design document for them, as well as a 1-to-1 fidelity interactive mockup in Unity (an engine I'm more familiar with), but it will be up to next year's designer to actually implement them during their run on this project.

 

In retrospect, I would probably have spent more time learning the engine early on, perhaps cutting secondary design work to make room for it, as it would have allowed me to better optimize my efforts and my impact on the project.

Another thing that didn't work as planned was the particle asset implementation: my knowledge of the engine was too limited for me to work on particle prototyping until it was too late (pretty much a week before the art lock date).

 

This inability to implement particle prototypes early on led me to create a UX that relied only lightly on particles for feedback, resulting in something a bit too static, lacking the dynamism and juiciness that particle-heavy UX systems usually have.

Overall, the research conducted over the span of the last four months, and the subsequent implementation of the designed UX, proved an illuminating experience for me, as I had never worked on so many user interfaces before (my UX work had mostly specialized in gamespace feedback rather than proper menus), and the wide range of interface types was certainly a challenge.


I learned a lot about standardizing the overall UX of a game and integrating that into the UI system, especially when it comes to inventory management, farming, and crafting, by using a system of templates with font, shape, color, and texture repetition that eases the player's cognitive load.

I also learned how to design a dialogue UX and UI aimed at accommodating the narrative needs of the game, and the importance of close collaboration with UI artists and narrative designers in order to deliver a fine-tuned, coherent experience.

Of course, not everything went smoothly, and there were some hiccups during production, namely not being able to implement as many particle effects and UI animations as I wanted to. While some of that was out of my control, I still made the mistake of adopting a workflow that did not adapt well to my inexperience with Unreal Engine, which led me to overfocus on some areas (like inventory interfaces) while leaving others behind (like UI animations), with the result that not all the features I designed ended up implemented in the game at the time of delivery.

To conclude, I can confidently say that I now have a broader and deeper knowledge of the topics discussed in this research paper, as well as of the limitations of the workflow I have used until now, which will be addressed, modified, and iterated on depending on the nature of my next project.
