Good question. Let me answer it the only way that makes sense.

```python
class AnujithBalan:
    """
    Not your average developer.
    Allergic to boring. Obsessed with privacy.
    Currently: making AI smarter without
    making humans more vulnerable.
    """
    name = "Anujith Balan"
    from_ = "Kerala, India 🇮🇳"
    building = ["Full Stack Apps", "ML Projects"]
    heading_to = "Federated Learning 🔐"
    learning = ["Docker 🐳", "Supabase ⚡", "AWS ☁️"]
    open_to = "Collabs, ideas, random DMs"  # True
    fun_fact = "Dark mode > light mode. Always."
    # (light attracts bugs. this is not a joke.)

    def philosophy(self):
        return "The AI should come to your data. " \
               "Not the other way around."
```
You've probably heard: "AI needs your data to learn." What if that was never actually true?
The short version: Your data stays on your device. The AI learns from everyone, everywhere — without seeing anyone's actual data. It's how Google Keyboard learns your typing style without Google ever reading your messages.
Pretty wild, right? That's the rabbit hole I'm going down. 🐇
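The core trick is federated averaging: each device trains a copy of the model on its own data, and only the updated weights travel back to a server, which averages them. Here's a minimal toy sketch of that idea; all the names and the "training" step are illustrative, not any real federated learning library's API.

```python
# Toy sketch of federated averaging (FedAvg).
# Each device's dataset stays local; only model weights are shared.

def local_update(weights, data, lr=0.1):
    """One device's training step: nudge each weight toward the
    mean of its private data. The data never leaves this function."""
    mean = sum(data) / len(data)
    return [w - lr * (w - mean) for w in weights]

def federated_average(global_weights, device_datasets):
    """The server's step: collect each device's updated weights
    and average them -- it never sees the raw data."""
    updates = [local_update(list(global_weights), data)
               for data in device_datasets]
    n = len(updates)
    return [sum(ws) / n for ws in zip(*updates)]

# Three devices, three private datasets that stay on-device.
devices = [[1.0, 2.0], [3.0], [2.0, 2.0, 2.0]]
model = [0.0]
for _ in range(5):
    model = federated_average(model, devices)
print(model)  # the shared model drifts toward what ALL devices know
```

Real systems (TensorFlow Federated, Flower) layer secure aggregation and differential privacy on top, but the round-trip shape is the same: model out, weights back, average, repeat.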
I'm into collabs, interesting problems, and conversations that start with "hear me out..."
Ask me about → Node.js · React · Firebase · ML · or why Federated Learning is lowkey the most important idea in AI right now


