Fix user authentication for shot submissions

jasonnovack (@jasonnovack) · Jan 7, 2026 · claude_code · claude-opus-4-5-20251101 · fix

Diff

diff --git a/packages/cli/src/commands/submit.ts b/packages/cli/src/commands/submit.ts
index 7083181..bff17e8 100644
--- a/packages/cli/src/commands/submit.ts
+++ b/packages/cli/src/commands/submit.ts
@@ -221,6 +221,9 @@ export async function submit(options: SubmitOptions) {
   if (config?.token) {
     headers['Authorization'] = `Bearer ${config.token}`
+    if (config.user?.id) {
+      headers['X-User-Id'] = config.user.id
+    }
     console.log(` Authenticated as @${config.user.username}`)
   } else {
     const apiKey = process.env.ONESHOT_API_KEY || ''
diff --git a/packages/web/src/app/api/shots/route.ts b/packages/web/src/app/api/shots/route.ts
index 81f4877..6281a8b 100644
--- a/packages/web/src/app/api/shots/route.ts
+++ b/packages/web/src/app/api/shots/route.ts
@@ -18,9 +18,16 @@ async function authenticate(request: NextRequest): Promise<AuthResult> {
   // Check for Bearer token (from CLI login)
   const authHeader = request.headers.get('Authorization')
   if (authHeader?.startsWith('Bearer ')) {
-    // In a production app, you'd validate this token against a tokens table
-    // For now, we just accept any bearer token (the token was generated during device auth)
-    // This is simplified - in production, store and validate tokens properly
+    // Get user ID from header (sent by CLI)
+    const userId = request.headers.get('X-User-Id')
+    if (userId) {
+      // Verify user exists
+      const [user] = await db.select().from(users).where(eq(users.id, userId)).limit(1)
+      if (user) {
+        return { authenticated: true, userId: user.id }
+      }
+    }
+    // Token valid but no user ID - still authenticated but anonymous
     return { authenticated: true, userId: undefined }
   }
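The CLI side of this fix is small enough to sketch in isolation. The following is a minimal, hypothetical reconstruction (the `Config` interface and `buildHeaders` helper are illustrative names, not part of the actual codebase) showing how the `X-User-Id` header is added alongside the bearer token only when a logged-in user is present:

```typescript
// Hypothetical shape of the CLI's stored login config (illustrative only).
interface Config {
  token?: string
  user?: { id: string; username: string }
}

// Build the request headers the same way the patched submit command does:
// send the bearer token, and, when we know who the user is, also send
// their ID so the API can attribute the shot instead of treating it as anonymous.
function buildHeaders(config?: Config): Record<string, string> {
  const headers: Record<string, string> = {}
  if (config?.token) {
    headers['Authorization'] = `Bearer ${config.token}`
    // The fix: forward the user ID when the login config includes one.
    if (config.user?.id) {
      headers['X-User-Id'] = config.user.id
    }
  }
  return headers
}

// Logged-in user: both headers are set.
const headers = buildHeaders({
  token: 'abc123',
  user: { id: 'u_1', username: 'jasonnovack' },
})
console.log(headers)
```

Note that the server still has to verify the supplied ID against the `users` table (as the second hunk does), since `X-User-Id` is client-supplied and a bearer token alone does not prove which user sent it.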

Recipe

Model
claude-opus-4-5-20251101
Harness
Claude Code
Token Usage
Input: 11.5K · Output: 198.2K · Total: 209.7K · Cache Read: 130.2M
Plugins
Frontend Design (claude-plugins-official) · Github (claude-plugins-official) · Feature Dev (claude-plugins-official) · Swift Lsp (claude-plugins-official)
Prompt
Let's write a product spec for a web app called Oneshot. There is a huge amount of interest right now in AI model selection, harness selection, prompt engineering, context engineering, MCP, etc., etc. The goal is to create an app where users can showcase what they can do and how they do it. The idea is to let users showcase before/after and how they get amazing results with AI. I am thinking there are 3 key components:
1. we take a repo BEFORE state and host it somewhere, or accept a hosted version like a Vercel build that we can verify corresponds with the repo's BEFORE state.
2. users invoke AI, and we capture a VERIFIED set of attributes that allows full reproducibility of this AI action (model, harness, prompt, context, and anything else that's relevant).
3. we take a repo AFTER state and host it somewhere or accept a user-hosted version that we can verify corresponds with the repo's AFTER state.
Once we have those 3 components, we build a simple database of all submitted examples, gallery app to discover and inspect them.
Raw Session Data
Tip: Copy the prompt and adapt it for your own project. The key is understanding why this prompt worked, not reproducing it exactly.
