The Origin Story
It started with a different idea entirely.
A while back, I built a Chrome extension called Claudify—a toolbox for Claude.ai with a PDF creator that users loved. It was useful, functional, and solved a specific problem. But as with all creative endeavors, one idea sparked another.
My girlfriend and I were discussing the future of these tools. AI was booming, and I remember thinking aloud, "What if we created a single tool that could store all these AI chats?" That was the seed—a centralized hub for your AI interactions.
But as we peeled back the layers, we realized something bigger. Why limit ourselves to just AI chats? If we're building a home for chat pages, why not make it capable of storing any website we visit?
And just like that, Luma Bookmark Manager was born.

The final product: A folder-based bookmark manager that feels like a native file explorer.
The Discussions
This wasn't just about coding—it was about crafting.
Back during our time at the University of Moratuwa, my girlfriend and I would grab a coffee at the "Wala" canteen and head to a place called "Lagaan"—a beautiful, open-air, theatre-like spot filled with trees and the sound of wind. A place of quiet and peace.
We'd sit there, sipping our coffee, and just talk. We discussed the vision. Sketched user interfaces on notebooks. Debated features.

Early sketches and UI ideas from our coffee-fueled design sessions.
But most of our time was spent on the system architecture. We had a clear goal: build something incredibly cost-effective for us to maintain, yet powerful enough to provide every feature a user would need.
The Philosophy: Local-First
"Occam's razor suggests that the simplest explanation is usually the best one. In software engineering, the simplest architecture is often the most performant one."
When designing Luma's file system, we had a choice: build a complex, graph-based structure for nested folders, or keep it dead simple.
We chose simple.
The Flat-File Architecture
It's tempting to model a file system as a nested tree of objects—folders containing arrays of children. While intuitive, this structure becomes a nightmare for performance. Deeply nested updates require tree traversal, and syncing with a backend like Google Drive is error-prone.
Instead, we adopted a Flat-File System:
- Every folder and bookmark is a standalone record
- Hierarchy is maintained through a parentId reference
- A folder at depth 10 is stored exactly like a folder at depth 1
No nesting in the database layer. Just flat records with pointers.
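As a rough sketch, a flat record could be modelled like this in TypeScript (the field names are illustrative, not Luma's actual schema):

```typescript
// Hypothetical shape of a flat-file record. Every folder and bookmark is a
// standalone row; the only link to the hierarchy is the parentId pointer.
interface LumaNode {
  id: string;            // unique identifier, also used as the sync filename
  userId: string;        // owner of the record
  parentId: string;      // "root" for top-level items, otherwise a folder id
  type: "folder" | "page";
  name: string;
  url?: string;          // present only for bookmarks (type === "page")
  updatedAt: number;     // unix timestamp (assumed), useful for sync ordering
}

// A folder at depth 10 is stored exactly like a folder at depth 1:
const deepFolder: LumaNode = {
  id: "f-42",
  userId: "u-1",
  parentId: "f-41",      // just a pointer; no nesting in the stored record
  type: "folder",
  name: "Reading List",
  updatedAt: Date.now(),
};
```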
Why This Matters
When you open a folder, we don't traverse a tree or recursively search through JSON. We simply ask the database:
"Give me all items where parentId is X and userId is Y."
This is a single indexed lookup, O(log n) on the index rather than a full traversal, so it stays effectively instantaneous regardless of folder depth or total file count.
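With a composite index covering the user and parent fields, that query becomes one lookup. Here is a minimal sketch using the idb wrapper and the hypothetical record shape above; Luma's actual storage code may differ:

```typescript
import { openDB } from "idb";

// Open (or create) the local database with a composite index on
// [userId, parentId], so "children of folder X for user Y" is one lookup.
const dbPromise = openDB("luma", 1, {
  upgrade(db) {
    const store = db.createObjectStore("nodes", { keyPath: "id" });
    store.createIndex("byParent", ["userId", "parentId"]);
  },
});

// "Give me all items where parentId is X and userId is Y."
export async function getChildren(userId: string, parentId: string) {
  const db = await dbPromise;
  return db.getAllFromIndex("nodes", "byParent", [userId, parentId]);
}
```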
IndexedDB + Smart Caching
Our local data layer rests on three pillars:
- IndexedDB — Persistent local storage with composite indexes
- Zustand — In-memory state management
- Lazy Loading — Load data only when needed
The Fetch Strategy
When you navigate to a folder:
- Check Memory — Is this folder's content already cached?
- If Yes — Render immediately. Zero latency.
- If No — Fetch from IndexedDB, populate cache, then render.
This creates a "warm" cache for frequently used directories while keeping initial load time near-instant.
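A simplified sketch of that strategy with Zustand, reusing the hypothetical getChildren helper and LumaNode type from the earlier sketches:

```typescript
import { create } from "zustand";
import { getChildren, type LumaNode } from "./db"; // from the earlier sketches

interface ExplorerState {
  // In-memory cache: folderId -> its children, loaded lazily.
  childrenByFolder: Record<string, LumaNode[]>;
  loadFolder: (userId: string, folderId: string) => Promise<LumaNode[]>;
}

export const useExplorer = create<ExplorerState>((set, get) => ({
  childrenByFolder: {},

  async loadFolder(userId, folderId) {
    // 1. Check memory: if cached, render immediately with zero latency.
    const cached = get().childrenByFolder[folderId];
    if (cached) return cached;

    // 2. Otherwise fetch from IndexedDB, populate the cache, then render.
    const children = await getChildren(userId, folderId);
    set((state) => ({
      childrenByFolder: { ...state.childrenByFolder, [folderId]: children },
    }));
    return children;
  },
}));
```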
The Sync Engine
A local-only app has a fatal flaw: lose your device, lose your data.
We needed backup and sync without compromising our core values: Speed and Privacy.
Most apps solve this by uploading data to their servers. We took a different path. We built a bridge connecting your local Luma database directly to your personal Google Drive.
Event-Driven Architecture
When you create, rename, or move a bookmark:
- Local Update (Immediate) — Change applied to IndexedDB instantly. UI updates.
- Queue Addition — A "job" is added to the SyncQueue, firing an internal event.
- Background Sync — The AutoSync listener picks up the event and syncs to Drive.
You never see a loading spinner for file operations. Organize 100 files in seconds, and Luma quietly handles syncing in the background.
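In sketch form, a write path like this might look as follows; the queue shape, the event name, and the helper functions are illustrative placeholders, not Luma's internals:

```typescript
import type { LumaNode } from "./db"; // flat record type from the earlier sketch

// Placeholders standing in for Luma's real storage and Drive sync code.
declare function saveLocally(node: LumaNode): Promise<void>;
declare function drainQueueToDrive(queue: SyncJob[]): Promise<void>;

export type SyncJob = { op: "create" | "rename" | "move" | "delete"; nodeId: string };

const syncQueue: SyncJob[] = [];

export async function renameBookmark(node: LumaNode, newName: string) {
  // 1. Local update (immediate): write to IndexedDB so the UI can update now.
  await saveLocally({ ...node, name: newName });

  // 2. Queue addition: record a job and fire an internal event.
  syncQueue.push({ op: "rename", nodeId: node.id });
  window.dispatchEvent(new CustomEvent("luma:sync-queued"));
}

// 3. Background sync: a listener picks up the event and syncs to Drive.
window.addEventListener("luma:sync-queued", () => {
  void drainQueueToDrive(syncQueue);
});
```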
Why a Queue?
Imagine moving a folder with 50 sub-items. Synchronous Drive sync would mean waiting for 50 API calls. With our queue:
- Offline Capable — Queue pauses when offline, resumes automatically
- Resilient — Failed jobs stay in queue for retry
- Non-Blocking — Main thread stays free for user interactions
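Here is a minimal sketch of such a drain loop with the offline pause and retry behaviour described above; again, an illustration rather than the actual AutoSync code:

```typescript
import type { SyncJob } from "./sync-queue"; // from the previous sketch

// Placeholder for the single Drive API call that uploads one change.
declare function pushJobToDrive(job: SyncJob): Promise<void>;

export async function drainQueueToDrive(queue: SyncJob[]) {
  while (queue.length > 0) {
    // Offline capable: pause draining and resume when the browser reconnects.
    if (!navigator.onLine) {
      window.addEventListener("online", () => void drainQueueToDrive(queue), {
        once: true,
      });
      return;
    }

    const job = queue[0];
    try {
      await pushJobToDrive(job);
      queue.shift(); // Resilient: only dequeue after a successful sync.
    } catch {
      // Failed jobs stay in the queue; back off briefly before retrying.
      await new Promise((resolve) => setTimeout(resolve, 5_000));
    }
  }
}
```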
Privacy by Design
Since sync happens directly between your browser and Google Drive, we never see your data. There's no "Luma Server" holding your bookmarks.
- Your Cloud — Data stored in a dedicated LUMASYNC folder in your Drive
- Your Control — View, download, or delete anytime
- Zero Lock-in — Stop using Luma? Your data stays yours.
Inside the LUMASYNC Folder
Open your Drive and you'll find thousands of small files:
- .folder — Represents a directory
- .page — Represents a bookmark
Each filename corresponds to its ID in our local database. Open one:
```json
{
  "id": "1234-5678",
  "name": "My Cool Bookmark",
  "url": "https://example.com",
  "parentId": "root",
  "type": "page"
}
```

Plain JSON. Readable. Auditable.
Why No Encryption?
This was one of our oldest debates. We chose plain text for three reasons:
- Performance — Encryption overhead would degrade the "instant" feel
- Transparency — You can audit exactly what we're storing
- Data Ownership — Write scripts, migrate to other services, back up independently
Your Drive is already encrypted at rest by Google. We focus on speed.
Coming soon: The Vault — client-side encryption for sensitive items.
The Shallow Router → use-address-state
Luma uses the URL as the source of truth for navigation. Everything is right there in the address bar:
- Persistence — Reload the browser, you're exactly where you left off
- Navigation — Browser back/forward buttons work perfectly
But we hit a hurdle with Next.js's default router: every route change fetched data from the server. For a file explorer, this meant:
- Perceptible latency on every folder click
- Massively increased edge request costs
The Solution
We built The Shallow Router—a "fake" router that updates the URL without triggering server requests.
We patched the native window.history.pushState and replaceState methods to dispatch custom events, then built a subscription system that reacts to URL changes efficiently.
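In essence, the patch looks something like the sketch below. It is a simplified illustration of the idea, not the exact source of use-address-state:

```typescript
// Patch pushState/replaceState so URL changes can be observed without a
// full Next.js navigation, then expose a subscription for listeners.
const LOCATION_EVENT = "luma:locationchange";

function patchHistory(method: "pushState" | "replaceState") {
  const original = history[method].bind(history);
  history[method] = (...args: Parameters<History["pushState"]>) => {
    original(...args);
    // Dispatch a custom event instead of letting the framework refetch.
    window.dispatchEvent(new Event(LOCATION_EVENT));
  };
}

patchHistory("pushState");
patchHistory("replaceState");

// Subscribers re-read the URL (e.g. search params) whenever it changes.
export function onLocationChange(listener: () => void) {
  window.addEventListener(LOCATION_EVENT, listener);
  window.addEventListener("popstate", listener); // back/forward buttons
  return () => {
    window.removeEventListener(LOCATION_EVENT, listener);
    window.removeEventListener("popstate", listener);
  };
}
```

A React hook can then subscribe to onLocationChange (for instance via useSyncExternalStore) and re-read the search params on every change.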
The result? Native speed. It feels like an OS-level file explorer but retains all the navigation capabilities of a modern browser.
This architecture worked so well that I extracted it into a standalone library: use-address-state. A lightweight React hook for managing state in URL search params, now available on npm for anyone to use.
Intelligent Search
Traditional search fails when you forget the name.
We've all done it: bookmark a page thinking "I'll need this later," then six months later remember the concept but not the title. Fuzzy search can't help if you don't remember the exact words.
Semantic AI Search
Our Intelligent Agent understands the meaning behind your query. Search for "coding tutorials for beginners" and it finds bookmarks labeled "React 101" or "Python Basics".
And it runs entirely locally in your browser.
We use the Xenova/all-MiniLM-L6-v2 model (8-bit quantized)—lightweight, fast, and private. Your queries never leave your device.
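For illustration, loading that model in the browser with Transformers.js and embedding a query looks roughly like this; the ranking logic around it is simplified:

```typescript
import { pipeline } from "@xenova/transformers";

// Load the quantized MiniLM model once; after the first download it is
// cached locally and everything below runs in the browser.
const embedderPromise = pipeline(
  "feature-extraction",
  "Xenova/all-MiniLM-L6-v2",
  { quantized: true },
);

export async function embed(text: string): Promise<number[]> {
  const embedder = await embedderPromise;
  // Mean-pool token embeddings and L2-normalize to get one vector per text.
  const output = await embedder(text, { pooling: "mean", normalize: true });
  return Array.from(output.data as Float32Array);
}

// With normalized vectors, cosine similarity reduces to a dot product.
export function similarity(a: number[], b: number[]): number {
  return a.reduce((sum, v, i) => sum + v * b[i], 0);
}
```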
Hybrid Search Engine
AI alone isn't enough. Sometimes you do want exact name matching. We built a Hybrid Search:
- FlexSearch — Handles fuzzy matching and typos
- Semantic AI — Understands what you actually mean
Best of both worlds.
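A rough sketch of how the two layers could be combined; the scoring and merge strategy here are illustrative, not Luma's exact ranking:

```typescript
import FlexSearch from "flexsearch";
import { embed, similarity } from "./semantic"; // from the sketch above

// Keyword layer: tolerant, typo-friendly matching on bookmark names.
const keywordIndex = new FlexSearch.Index({ tokenize: "forward" });

export function indexBookmark(id: string, name: string) {
  keywordIndex.add(id, name);
}

// Hybrid query: take keyword hits first, then fill in semantic matches.
export async function hybridSearch(
  query: string,
  bookmarks: { id: string; name: string; vector: number[] }[],
) {
  const keywordHits = new Set(keywordIndex.search(query, 10) as string[]);

  const queryVector = await embed(query);
  const semanticHits = bookmarks
    .map((b) => ({ id: b.id, score: similarity(queryVector, b.vector) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, 10)
    .map((b) => b.id);

  // Exact-name matches rank first; semantic results cover "meaning" queries.
  return [...keywordHits, ...semanticHits.filter((id) => !keywordHits.has(id))];
}
```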
The Result
What started as "what if we stored AI chats?" became a fully-featured bookmark manager:
- Blazing Fast — Local-first architecture with instant navigation
- Privacy-First — Your data syncs to your Drive, not our servers
- Intelligent — AI search that understands intent, not just keywords
- Native Feel — Browser back/forward buttons, URL persistence, zero network lag
Built with love, coffee, and long discussions at Lagaan.
Try Luma
Chrome Web Store
Install the extension to start organizing your web life with folder-based bookmarks.
Luma Homepage
Explore the features, documentation, and the philosophy behind Luma.
Read the Blog
Deep dives into the flat-file architecture, performance optimizations, and design decisions.
Welcome to Luma.