Developing A New Kind of Education

Students in traditional classrooms aren’t learning how to excel in the dynamic 21st-century job market.

Educators are in an impossible situation where they must prepare students for standardized tests while also facilitating collaborative inquiry-driven learning. There isn’t enough time to do both well with traditional tools.

Educators cannot tailor lectures to each individual student, and confusion during out-of-class homework can compound, ultimately derailing entire exercises. How might instructors overcome these challenges to better address each student’s unique needs?

Sapling Insights

Dr. Laurie Parker discusses how Sapling Learning’s approach improves student grades

Student Insights

Student credits Sapling Learning for his A in Economics.

The Future of Learning

Macmillan New Ventures works to grow ed-tech startups that will eventually be the future of the publishing business.

They had just acquired a learning management system used in over 800 schools and had several peer-reviewed studies demonstrating its ability to improve student grades.

Students and teachers loved the platform, but they wanted it on their mobile devices. Unfortunately, most of the core experience was built in Flash and had to be totally reconstructed.

Macmillan recognized this was a great opportunity and was growing an internal mobile team tasked with re-imagining the legacy service for emerging platforms.

Their mobile team had a rough prototype and backlog, but they needed some help with user-centered design.

Learning From the Users

We knew we couldn’t design a better experience for students using the product without getting to know them.

We began with secondary research, learning a ton from Macmillan’s hyper-knowledgeable marketing team. Our team also conducted personal interviews and scenario-driven user tests, all building toward answering one question:

How might we help students learn faster?

These experiments enabled us to cycle rapidly through our hypotheses, validating some and throwing out others. This prototyping cycle gradually refined the question flows as we learned about our users’ behavior patterns and expectations.

We also ran unmoderated usability tests on a population of American graduate students. These tests helped us ensure that core features worked outside of a laboratory setting.

After each exercise, we mapped our findings and calculated significance. This analysis prompted discussions where we hashed out what was currently working and what needed help.
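A significance check like the one described above can be sketched as a two-proportion z-test comparing task completion rates between two design variants. This is a minimal illustration, not the team’s actual analysis; the function name and all participant numbers are hypothetical.

```python
from math import sqrt, erf

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided z-test: did task completion rates differ between variants?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled proportion under the null hypothesis of equal rates
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 18 of 20 users completed the task with the new
# flow versus 11 of 20 with the old one.
z, p = two_proportion_z_test(18, 20, 11, 20)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With those made-up counts the difference clears the conventional p < 0.05 threshold, which is the kind of result that would prompt a “what’s working” discussion rather than settle it on its own.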

Minimum Viable Product

During our discovery period, we constantly built out ideas and refined them to their core elements. Our experiments helped us to trim the fat and optimize the core learning flows.

With each iteration, we’d generate tons of ideas and ‘diverge’ before we analyzed and focused our findings and ‘converged’ around the fundamentals. These build-measure-learn loops helped us identify where the value was for our users and prioritize features based on those findings.

Minding the Details

Robust ‘Attempt Navigator’

We consolidated the prototype’s broken ‘attempt tabs’ into a navigation widget that could accommodate the entire range of possible attempts.

The navigator also shows users which attempts they got correct, significantly improving wayfinding for users navigating between past attempts.

Navigation Icon

After multivariate scenario and affinity testing, we determined that users preferred the three-bar ‘hamburger menu’ for opening the sidebar question selector.

We were careful to validate the icon using quantitative and qualitative testing methods and are confident that the icon is understood by our users.

Contextual Action Button

Our research indicated that users expected a single ‘next’ button that was always in the same screen position. We initially stumbled by offering a single button that advanced state at both the question and assignment level.

Our second implementation separated these buttons and served them based on where the user was in their question flow.

What We Learned

Legacy user expectations keep interfaces from changing too quickly

Macmillan’s platform has thousands of existing users who have product and interface expectations. For us, this meant updating the homework platform required heightened sensitivity to the platform’s history and infrastructure.

These expectations repeatedly led us to make visual tradeoffs in order to preserve user comprehension as the platform transitioned.

Integrating Lean UX into an existing product backlog takes special focus

The uncertain results of user research make planning longer sprints challenging. We learned that shorter sprints let the team respond more nimbly, communicating findings in retrospectives and scheduling responses during sprint planning.