Summary: Educators need a simple way to track their students’ performance. After gathering feedback, our vision evolved from an app for managing grades into a tool that helps educators guide their students to success.
My role: I led the design process, from research to high-fidelity design, working together with a PM and engineers.
Client: Aula Education.
Tools: Pen & paper, Miro, Balsamiq, Sketch, Abstract, Figma.
Aula Education is a social learning platform for higher education. As part of its ecosystem of applications, I was in charge of designing a tool to help university educators assess and guide their students toward academic success. Initially, it was meant to be an application to help them manage raw data about their students (grades, specifically), but we soon realized they needed more than that.
Unfortunately, this project wasn’t implemented, but it’s a good example of the design process I generally follow up to the implementation phase. I led and executed the entire design process, working with a project manager (PM) and a technical leader (TL).
🤔 The problem
Educators need a way to follow and assess how their students are performing in their classes. During interviews with educators (US & UK), we found that managing students’ data is the least fun part of education. This is especially true if you’re in charge of hundreds of students, and at the end of the day, raw data alone doesn’t lead to many actionable insights.
We also found that the way they manage this data is neither consistent nor standardized (sometimes even within a single institution), with information scattered across different apps and tools. Moving this information between systems is an ordeal when they are facing tight school deadlines. Because of this broken system, they miss opportunities to identify and help their students achieve success.
From the students’ perspective, they need to know how they are performing in a given class and identify areas where they can do better. We wanted to ensure that educators and students could collaborate and tackle success together.
🤹‍♀️ Challenges
Our biggest challenge was to provide a cohesive experience by unifying all the different sources of information. Educators deal with different apps to create assignments and record grades; they manually calculate final grades (each using their own grading system), write them down on paper to digitize later, send emails to their students with feedback, and so on. That takes a lot of their valuable time and energy, and let’s face it, motivation.
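To make that manual work concrete, here is a minimal sketch of the kind of weighted final-grade calculation educators described doing by hand or in ad-hoc spreadsheets. The assignment names and weights are hypothetical examples, not an actual grading scheme from our research:

```python
# A minimal sketch of a weighted final-grade calculation.
# Assignment names and weights are hypothetical, not a real grading scheme.

def final_grade(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of per-assignment scores on a 0-100 scale."""
    total_weight = sum(weights.values())
    return sum(scores[name] * weights[name] for name in weights) / total_weight

scores = {"essay": 80.0, "midterm": 70.0, "final": 90.0}
weights = {"essay": 0.25, "midterm": 0.25, "final": 0.5}
print(f"Final grade: {final_grade(scores, weights):.1f}")  # Final grade: 82.5
```

Trivial on its own, but educators were repeating this by hand for hundreds of students, each course with its own scheme.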
💊 The solution
A connected application for both educators and students that allows educators to manage information about their students’ performance while also identifying important trends and patterns.
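As a rough illustration of what “identifying trends” could mean in practice, a least-squares slope over a student’s recent grades is enough to flag a downward trajectory. This is a sketch under my own assumptions (the function name and alert threshold are hypothetical), not the product’s actual logic:

```python
# Illustrative sketch: flag students whose recent grades trend downward.
# The threshold and sample data are hypothetical, not from the real product.

def grade_trend(grades: list[float]) -> float:
    """Least-squares slope of grades over assignment index (points per assignment)."""
    n = len(grades)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(grades) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, grades))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

grades = [85, 82, 78, 74, 70]   # chronological scores for one student
slope = grade_trend(grades)     # -3.8 for this data
if slope < -2.0:                # hypothetical alert threshold
    print(f"Heads-up: grades dropping ~{abs(slope):.1f} points per assignment")
```

The point of a signal like this is timing: it surfaces a struggling student while there is still time to act, rather than after final grades are in.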
👩🏻‍🔬 The design process
Discovery: Along with the PM, we started the discovery phase by trying to understand the problem we’re solving and who we’re building for. We gathered information from within the organization, performed competitive analyses (looking at what others are doing), learned about the current tools educators and students were using, defined high-level goals, and identified our hypotheses. Later in the process, after talking to our users, we’d define what success meant for this project.
Ideation: After this, we quickly started sketching collaboratively (and remotely) on a virtual whiteboard. At this stage, there was a lot of collaboration and feedback between the PM and myself. We did a couple of iterations until we had a set of options we thought made sense and that we could start testing as soon as possible. We also started to introduce the idea to different stakeholders, including the lead engineer.


Prototyping and testing: With those ideas, I built two quick prototypes (using Figma) to help us validate our assumptions. The two prototypes reflected different options within the direction we thought was most solid. Now we were ready to get feedback from users!
The goal was to interview educators and students to better understand them and their struggles, find opportunities, and test our concepts. I was in charge of planning the tests, helping recruit participants, and running the sessions remotely.
To plan it, I used my usual template for answering the questions: what, why, who, where, when, and how. Then I shared it with the PM and the VP of product (who was aware of the project and goals) for feedback. Once we agreed on the basics, we continued by listing the key questions and what we wanted to learn from our users: Is it easy to use? Are we answering their questions? Are we missing a big opportunity? Is the onboarding helping them see any value? Are we using the right terms and concepts?
Once the initial plan was ready, we sought help from our support team for recruiting the right candidates. They helped us find educators and students that matched the profiles we needed (based on our user personas). We sent the invitations and started scheduling the remote sessions. We ended up having enough participants to do two rounds of interviews with improvements in between.
During the interviews I was the main moderator, while the PM acted as co-moderator and note-taker. The interviews were a mix of discovery questions and usability tests using the prototypes. At this point, we also included the lead engineer in the process as an observer, as we wanted him to understand the rationale behind upcoming decisions and to get to know our users. After each session, we shared our observations and gave each other feedback on what to improve for the next interview.

I prepared a quick report; we synthesized the findings together and shared the insights with the rest of the team (PMs, designers from other product areas, and the VP of product). Sharing the findings this way let us digest the information, challenge our assumptions, and adjust our goals together. This is when we realized, based on the findings, that we were on the right track, but that we needed to offer more value than just managing raw data.
At this point, we brought in the rest of the engineering team to give them context on what the project was about and the direction we wanted to take. I presented the design direction (with wireframes and prototypes) to the team and answered questions and concerns. They provided feedback on potential technical constraints, without getting into so much detail that it would limit the ideal solution.
Once we had iterated to a solution we were satisfied with (at least for a first version), we divided the work into milestones for a more digestible building process, one that would let us deliver value at each step and learn from our users fast. This was done between the PM, the lead engineer, and myself.
Detailed design: After understanding what we needed to improve, I adjusted the user flows and started designing in higher fidelity. I began this stage by defining the high-level application structure using responsive grids, followed by the design of each screen, leaving more specific definitions (like the color palette) for later. We also worked on the copy, making sure we spoke with the right voice and tone and used concepts that educators and students were familiar with.






I also involved my design colleague (working in another product area) in regular design critique sessions. I used our existing UI library as a reference, which made my work much more efficient. There was also a continuous back and forth with the PM for feedback and questions, and with engineers for more detailed technical input.
As part of this detailed design work, I also created a simple checklist for delivering accessible designs, since we needed to comply with the WCAG 2.0 AA standard. The checklist also served as a quick reference guideline for the team’s future work. I shared it with the team, collected feedback, and iterated on it.
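One concrete item on such a checklist is color contrast. Here is a sketch of the WCAG 2.0 contrast-ratio math that designs can be checked against; the helper names are mine, but the formulas and the AA thresholds (4.5:1 for normal text, 3:1 for large text) come from the standard:

```python
# Sketch of the WCAG 2.0 contrast-ratio check behind the AA criteria.
# Helper names are mine; formulas and thresholds come from the standard.

def relative_luminance(hex_color: str) -> float:
    """Relative luminance of an sRGB color like '#1A2B3C' (WCAG 2.0 definition)."""
    def channel(c: int) -> float:
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """Ratio (lighter + 0.05) / (darker + 0.05), from 1:1 up to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

ratio = contrast_ratio("#FFFFFF", "#6B7280")  # white text on a hypothetical gray
print(f"{ratio:.2f}:1 -> AA normal text: {ratio >= 4.5}, AA large text: {ratio >= 3.0}")
```

Encoding checks like this in a checklist (or a small script) means contrast issues get caught at design time rather than in review.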
✅ Outcomes
Although this project was not implemented, so unfortunately we won’t know its impact on our users’ lives, we gained a deeper understanding of our users, their needs, and their pains. We saw their process firsthand and learned what was most important for them to achieve: their students’ success.