SnapChem

AR Chemistry Class

Project Type:

Hackathon Team Project

Role:

UX/UI & Spatial Experience Designer · AR Prototyping · Pitch/Storytelling

Platforms:

Spectacles · Lens Studio · Snap Cloud · Figma · Generative AI

Location:

Reality Hack at MIT (Massachusetts Institute of Technology)

Timeline:

3–4 days (hackathon sprint)

Project Overview

SnapChem is a context-aware AR learning experience for students and educators. Built for Snap Spectacles, it embeds spatial, multimodal chemistry content directly in the real-world environment, helping learners understand and retain complex concepts through hands-on interaction.


Problem:

Indoor navigation makes people stop, hesitate, and constantly “re-orient” using 2D maps.

Idea:

Replace map reading with instinctive following, guided by a calm companion and environment-anchored cues.

Solution:

A cat guide plus a ground-anchored Paw Trail and Target Point that reinforce spatial trust.

Outcome:

A coherent spatial UI concept and prototype direction for hands-free wayfinding on smart glasses.


Speaker & Live Demo

As part of the XR Guild NYC event, I was invited as a speaker and demo presenter. I introduced the project to the XR community and facilitated live, hands-on demos, letting participants engage directly with the spatial interaction design and learning concepts in real time.

Reflections

Designing for Spatial Trust
I learned that in XR, “clarity” is not just visual; it is behavioral. Consistent anchoring, scale, and occlusion rules make guidance feel believable, which reduces hesitation and keeps users confident without adding UI noise.

Calm UX Beats “More UI”
The biggest improvement came from subtracting, not adding: keeping cues lightweight, peripheral-friendly, and context-aware. Minimal prompts at the right moments created a smoother experience than constant, map-like instructions.

Anchoring Is a Design Decision
World-locked, head-locked, and camera-locked elements each serve a distinct purpose. Choosing the right lock mode at the right time (route cues vs. confirmations) helped the interface stay comfortable, readable, and aligned with spatial UI best practices.

