Rethinking Banking for Mzansi: My Journey Through the ABSA Hackathon
Back in early 2023, I found myself in one of those long ABSA queues in Sandton – you know, the kind that make you question every life choice. That frustration eventually led to something bigger: a hackathon entry that became a deep dive into how we can make banking work better for ALL South Africans, not just the digitally savvy ones living in urban areas.
This project taught me more about inclusive design than any textbook ever could. Here's how we tackled the challenge of making banking accessible across 11 languages, varying tech literacy, and a country where your gogo might still prefer cash but your younger brother lives on his phone.
The Reality Check
Getting clear on what we're actually solving (and for whom)
The Problem I Couldn't Ignore
Business Reality
ABSA and other major banks are caught between a rock and a hard place: digital transformation pressure vs. customer preference for human interaction.
The constraint: Reduce operational costs without alienating loyal customers who built relationships over decades.
Success Metrics
Reduce wait times: From 23 minutes to under 15
Increase satisfaction: 6.2 to 8.0/10 (being realistic here)
Language inclusivity: Support for 9 languages (started ambitious, scaled back)
Task completion: 85%+ for users 55+
What I Discovered
This wasn't just about making things digital. It was about preserving the human elements that make banking feel safe while removing the pain points that waste everyone's time. The challenge was finding that sweet spot.
Getting out of my comfort zone and into real banking queues
What I Actually Did (vs. What I Planned)
Branch Observations
Spent probably too many hours in bank queues with a notebook (and got some weird looks).
Key insight: The queue isn't just waiting time – it's social time. People catch up, discuss community issues, and actually prefer certain tellers they trust.
Customer Conversations
Spoke with everyone from entrepreneurs in Soweto to retired teachers in Stellenbosch. Most enlightening conversations happened over coffee, not in formal interview settings.
Surprise finding: Language preference isn't just about understanding – it's about feeling respected.
Staff Insights
Bank staff were surprisingly candid about their frustrations. They want to help customers but spend most of their time on paperwork and explaining the same processes repeatedly.
Aha moment: Staff satisfaction and customer satisfaction are deeply connected.
What I Learned (Sometimes the Hard Way)
Making sense of messy research and finding patterns
Meet the People Behind the Data
Age: 34 | Lives: Roodepoort | Languages: isiZulu, English
The Reality: Works in IT but still prefers seeing a human for anything involving large amounts of money. Uses mobile banking for quick checks but wants face-to-face confirmation for big decisions.
Age: 63 | Lives: Chatsworth | Languages: English, Tamil
The Reality: Retired principal who's actually quite tech-savvy (runs the school's WhatsApp group) but doesn't trust banking apps with her pension. Values relationships she's built with bank staff over 30 years.
Age: 28 | Lives: Limpopo (travels to Polokwane) | Languages: Sepedi, English
The Reality: Runs a small transport business, manages cash flow via mobile money but needs branch visits for business banking, loans, and cash deposits. Travels 45 minutes to nearest branch.
The Journey That Changed My Perspective
Current reality: "Do I have all my documents? What if their system is down again? Should I go early or will I waste my whole morning?"
Opportunity: Pre-visit confidence building through preparation tools and realistic wait time estimates.
Current reality: "How long is this going to take? Do I have time? Should I come back later?"
Opportunity: Clear communication and alternative options based on wait times and transaction complexity.
Current reality: Either great (you get someone who knows you) or frustrating (explaining your situation from scratch to someone who seems rushed).
Opportunity: Better staff preparation and customer context, without sacrificing privacy.
Current reality: "Did I really need to come here for this? Will I remember what they told me?"
Opportunity: Clear next steps and educational content that builds confidence for future self-service.
Turning insights into solutions (with several iterations)
Design Principles That Actually Mattered
Not just wait times, but "come back at 2pm for a 5-minute wait" recommendations (see the rough sketch after this list)
Transaction-specific document checklists and step-by-step prep guides
Matching customers with staff based on language and cultural preferences (when possible)
Gradually building digital confidence through positive experiences
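To make the smart-queue principle a bit more concrete, here's a rough sketch of how a "come back at 2pm" recommendation could work. It's purely illustrative: the hourly wait figures, the function name, and the 10-minute threshold are my own assumptions, not how ABSA's systems actually work.

```python
from datetime import datetime

# Purely illustrative: hypothetical average wait (minutes) per hour of the day,
# the kind of pattern a branch could learn from its own queue history.
HOURLY_WAIT_ESTIMATE = {9: 22, 10: 18, 11: 25, 12: 30, 13: 28, 14: 8, 15: 12, 16: 20}

def recommend_visit_time(now: datetime, max_wait: int = 10) -> str:
    """Suggest the next hour today when the expected wait drops below max_wait."""
    current_wait = HOURLY_WAIT_ESTIMATE.get(now.hour)
    if current_wait is not None and current_wait <= max_wait:
        return f"Come in now - the expected wait is about {current_wait} minutes."
    for hour in range(now.hour + 1, 17):  # assume the branch closes at 17:00
        wait = HOURLY_WAIT_ESTIMATE.get(hour)
        if wait is not None and wait <= max_wait:
            return f"Come back at {hour}:00 for a wait of roughly {wait} minutes."
    return "Today looks busy - consider booking a slot for tomorrow morning."

print(recommend_visit_time(datetime(2023, 5, 4, 11, 15)))
# -> Come back at 14:00 for a wait of roughly 8 minutes.
```

However the real thing gets built, the principle is the same: tell people when to come, not just how long the queue is right now.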
Building to learn, not to impress
Prototype Evolution (What Actually Worked)
Paper & Conversations
Sketched basic flows on paper and tested them with 12 people in actual bank queues. Security guards were initially suspicious, but customers were surprisingly generous with feedback.
Key insight: People's mental models of "preparation" were very different from my assumptions.
Interactive Prototypes
Built clickable prototypes in Figma and tested with families (kids helping parents, grandparents asking questions). Learned more from these sessions than any formal usability test.
Reality check: Voice navigation was popular in concept, challenging in practice with ambient noise.
Wizard of Oz Testing
Simulated the "smart queue" system manually for a few days, updating wait times by hand. Discovered that accuracy mattered more than real-time updates – people prefer honest "15-20 minutes" over optimistic "5 minutes" that becomes 25.
Learning from real people in real situations
Testing Reality vs. Research Plans
From hackathon concept to implementation reality
Implementation: How Things Actually Went
Started small: 3 branches in Sandton, Bellville, and Polokwane
What we adjusted: Pretty much everything based on real usage data
Scaling carefully: 15 branches with refined systems
What I'm still learning from ongoing usage
Ongoing Learning (The Project That Keeps Teaching)
What Actually Happened vs. What I Hoped
Average transaction time: reduced by 32% (target was 40%, but 32% is still meaningful)
Customer satisfaction: up from 6.2 (target was 8.5)
Positive feedback about branch visits (this surprised everyone)
Task completion for 55+ users (target was 85%, working on it)
The Interface That Emerged
These aren't the polished screens I started with. They're the ones that actually worked after months of iteration with real users in real branches.
What This Project Taught Me About UX