Opening the Black Box: API Tokens and GPT Connectors
The moment an app becomes useful to a few people, someone will inevitably ask: “Is there an API?”
For the longest time, the answer was no. Adding an API surface means committing to backward compatibility, enforcing rate limits, and defending a whole new attack surface. But as the platform grew, the need for external integrations—specifically AI and GPT connectors—became impossible to ignore.
Here is what we shipped in v1.5 and the subsequent patches.
Developer API Tokens and GPT Bridge
We introduced a dedicated Developer API token management page. You can now generate tokens, monitor and control their usage, and connect external tools.
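A minimal sketch of how token generation like this is commonly done. The `tkn_` prefix, scope names, and record shape are my assumptions, not the platform's real schema; the key idea is that only a hash of the token is ever stored server-side.

```python
import hashlib
import secrets

def generate_api_token(scopes):
    """Mint a developer API token (illustrative shape, not the real schema)."""
    raw = "tkn_" + secrets.token_urlsafe(32)
    record = {
        # Persist only a hash; the raw token is shown to the user exactly once.
        "token_hash": hashlib.sha256(raw.encode()).hexdigest(),
        "scopes": sorted(scopes),
    }
    return raw, record
```

Storing the hash means a database leak doesn't leak usable tokens: incoming requests are verified by hashing the presented token and comparing.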
The biggest addition here is the GPT connector auth bridge flow. We wanted users to be able to talk to their pet’s health records using AI, without handing over their primary login credentials. The new bridge flow establishes a secure, scoped connection between the user’s data and approved GPT implementations. We even automated the GPT schema extraction to ensure our OpenAPI specs always match runtime behavior.
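The bridge flow can be sketched as a one-time code exchange, in the spirit of an OAuth authorization-code grant. Everything below (function names, the in-memory stores, the TTL) is a simplified assumption about how such a flow works, not the actual implementation:

```python
import secrets
import time

PENDING = {}       # one-time bridge codes -> (user_id, scopes, expiry)
CONNECTIONS = {}   # scoped connector tokens -> (user_id, scopes)

def start_bridge(user_id, scopes, ttl=300):
    """Called after the user approves the connector in-app;
    mints a short-lived, single-use code."""
    code = secrets.token_urlsafe(16)
    PENDING[code] = (user_id, frozenset(scopes), time.time() + ttl)
    return code

def redeem_bridge(code):
    """The GPT connector exchanges the code for a scoped token.
    The user's primary credentials never leave the app."""
    entry = PENDING.pop(code, None)  # pop makes the code single-use
    if entry is None:
        return None
    user_id, scopes, expires_at = entry
    if time.time() > expires_at:
        return None
    token = secrets.token_urlsafe(32)
    CONNECTIONS[token] = (user_id, scopes)
    return token
```

The scoped token can only read what the user approved, and replaying the code fails, which is what makes this safer than handing the connector a password.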
Premium Roles and Storage Limits
We also had to face reality: server space isn’t free.
We added storage usage limits and enforcement, with current usage visible right in the user profile. To support the heavy users (those with multiple pets and years of medical records), we formalized a Premium User Role. Accompanied by a shiny new Premium avatar badge, this role isn’t about arbitrary paywalls—it’s strictly about covering the costs of resources like storage.
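Quota enforcement boils down to a per-role check before any upload is accepted. The tier sizes below are placeholders (the post doesn't state the real limits):

```python
GIB = 1024 ** 3
# Placeholder tiers; the actual limits are not stated in the post.
STORAGE_LIMITS = {"free": 1 * GIB, "premium": 20 * GIB}

def can_upload(role, used_bytes, upload_bytes):
    """Enforce the per-role storage quota before accepting an upload."""
    limit = STORAGE_LIMITS.get(role, STORAGE_LIMITS["free"])
    return used_bytes + upload_bytes <= limit
```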
If our philosophy says “Care is the point,” then ensuring the servers stay online for everyone is the ultimate act of care.
Security Tightening
- Raised the minimum password length from 8 to 10 characters.
- Extracted and refactored the auth pages into a clean AuthPageLayout.
- Handled the nightmare that is in-app browsers restricting push notifications.
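The password rule itself is a one-liner, but it matters that it runs server-side so client checks can't be bypassed. Length is the only rule mentioned above; the function name is illustrative:

```python
MIN_PASSWORD_LENGTH = 10  # raised from 8 in this release

def password_long_enough(password: str) -> bool:
    """Server-side check; length is the only rule enforced here."""
    return len(password) >= MIN_PASSWORD_LENGTH
```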
Building public APIs as a solo developer is terrifying. But seeing the platform become extensible makes it worth it.