Beyond data: Helping students succeed

Responsive app to help teachers manage students' information

Summary: Educators need a simple way to track their students' performance. After gathering feedback, our vision of a grade-management app evolved into a tool that helps educators guide their students to success.

My role: I led the design process, from research to high-fidelity design, working together with a PM and engineers.

Aula Education is a social learning platform for higher education. As part of its ecosystem of applications, I was in charge of designing a tool to help university educators assess and guide their students towards academic success. Initially, it was meant to be an application for managing raw data about students (grades, to be more specific), but we soon realised they needed more than that.

Unfortunately, this project wasn't implemented, but it is a good illustration of the design process I generally follow up to the implementation phase. I led and executed the entire design process, working with a project manager (PM) and a technical lead (TL).

The problem

Educators need a way to follow and assess how their students are performing in their classes. During interviews with educators in the US and UK, we found that managing students' data is the least enjoyable part of education. This is especially true if you're in charge of hundreds of students, and at the end of the day, data alone doesn't lead to many actionable insights.

We also found that the way they manage this data is neither consistent nor standardised (sometimes even within a single institution), with information scattered across different apps and tools. Moving this information between systems is an ordeal when they are facing tight school deadlines. As part of this broken system, they miss opportunities to identify and help their students achieve success.

From the students' perspective, they need to know how they are performing in a class and identify areas where they can do better. We wanted to ensure that educators and students could collaborate and tackle success together.


Our biggest challenge was to provide a cohesive experience that unified all the different sources of information. Educators deal with different apps to create assignments and record grades; they manually calculate final grades (using their own grading system), write them down on paper to digitise later, send feedback emails to their students, and so on. That takes a lot of their valuable time and energy, and, let's face it, motivation.

The solution

A connected application for both educators and students that allows educators to manage information about students' performance while also identifying important trends and patterns.

The process


The PM and I started the discovery phase by trying to understand the problem we were solving and who we were building for. We gathered information from within the organisation, performed competitive analyses (looking at what others are doing), learned about the current tools educators and students were using, defined high-level goals, and identified our hypotheses. Later in the process, after talking to our users, we would define what success meant for this project.


After this, we quickly started sketching collaboratively (and remotely) on a virtual whiteboard. At this stage, there was a lot of collaboration and feedback between the PM and myself. We did a couple of iterations until we had a set of options we thought made sense and that we could start testing as soon as possible. We also started to introduce the idea to different stakeholders, including the lead engineer.

Onboarding flow

Early sketches

Prototyping and testing

I developed two quick prototypes using Figma to validate assumptions and explore different directions. These prototypes represented various options for the most solid direction we identified. Our goal was to interview educators and students to understand their needs, struggles, and opportunities, and to test our concepts. I led the planning of the test, including recruitment and remote session execution. Using a standard template, I outlined key questions and objectives, which I shared with the PM and VP of Product for feedback. With support from our team, we recruited participants matching user personas and conducted two rounds of interviews, incorporating improvements between sessions. As the main moderator, I facilitated interviews while the PM served as co-moderator and note-taker. These sessions combined discovery questions and usability tests with prototypes. We also involved the lead engineer as an observer to understand user rationale and share insights for iterative improvement.

I compiled a concise report and collaborated with the team to synthesize our findings, sharing insights with PMs, designers, and the VP of Product. This enabled us to collectively analyze the information, challenge assumptions, and align our goals. Through this process, we recognized the need to provide more value beyond managing raw data. We then involved the engineering team to provide context and direction for the project. Presenting the design direction, including wireframes and prototypes, allowed for feedback on potential technical constraints. To ensure an efficient building process and iterative improvement, we divided the work into milestones, involving the PM, lead engineer, and myself.

Detailed design

After understanding what we needed to improve, I adjusted the user flows and started designing in higher fidelity. I began this stage by defining the high-level application structure using responsive grids, followed by the design of each screen, leaving more specific definitions (like the colour palette) for later. We also worked on the copy, making sure we used the right voice and tone, and concepts that educators and students were familiar with.

Responsive grids for multiple screen sizes
Defining the app structure
Example of interaction in data table
Student's view
Specs and details
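To make the idea of a shared grid structure concrete, here is a minimal sketch of how breakpoint tokens for a responsive grid could be modelled; the names and values below are illustrative assumptions, not the project's actual spec.

```typescript
// Hypothetical breakpoint tokens for a responsive grid (illustrative values only).
const breakpoints = {
  mobile: { minWidth: 0, columns: 4, gutter: 16 },
  tablet: { minWidth: 768, columns: 8, gutter: 24 },
  desktop: { minWidth: 1280, columns: 12, gutter: 24 },
} as const;

type Breakpoint = (typeof breakpoints)[keyof typeof breakpoints];

// Pick the widest breakpoint whose minWidth fits the current viewport.
function gridFor(viewportWidth: number): Breakpoint {
  return [breakpoints.desktop, breakpoints.tablet, breakpoints.mobile].find(
    (bp) => viewportWidth >= bp.minWidth,
  )!;
}
```

Defining these tokens once lets designers and engineers reference the same column counts and gutters per screen size, e.g. `gridFor(1024)` resolves to the 8-column tablet grid.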

I also involved my design colleague (working in another product area) in regular design critique sessions. I used our existing UI library as a reference, which made my work much more efficient. There was also a continuous back and forth with the PM for feedback and questions, and with engineers for more detailed technical input.

From this detailed design work, I also created a simple checklist for delivering accessible designs, since we needed to comply with the WCAG 2.0 AA standard. The checklist was also meant to serve as a quick guideline for the team to reference in future work. I shared it with the team, collected their feedback, and iterated on it.
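One item from such an accessibility checklist, colour contrast, can be checked programmatically. The sketch below implements the relative-luminance and contrast-ratio formulas defined in WCAG 2.0, where AA compliance requires a ratio of at least 4.5:1 for normal text; it is a minimal illustration, not part of the project's actual tooling.

```typescript
// Linearise one sRGB channel (0-255) per the WCAG 2.0 definition.
function channel(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

type RGB = [number, number, number];

// Relative luminance of an sRGB colour (WCAG 2.0).
function luminance([r, g, b]: RGB): number {
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Contrast ratio between two colours: (L_lighter + 0.05) / (L_darker + 0.05).
function contrastRatio(fg: RGB, bg: RGB): number {
  const l1 = luminance(fg);
  const l2 = luminance(bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// WCAG 2.0 AA threshold for normal-size text.
function passesAA(fg: RGB, bg: RGB): boolean {
  return contrastRatio(fg, bg) >= 4.5;
}
```

For example, black on white yields the maximum ratio of 21:1 and passes, while the grey `#777777` on white comes in just under 4.5:1 and fails AA for normal text.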


Although this project was not implemented, and so unfortunately we won't know its impact on our users' lives, we gained a deeper understanding of our users, their needs, and their pains. We saw their process firsthand and learned what was most important for them to achieve: their students' success.