[Feature] Add Luma connector #339
Conversation
Warning: Rate limit exceeded. @MODSetter has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 15 minutes and 51 seconds before requesting another review.

⌛ How to resolve this issue? After the wait time has elapsed, a review can be triggered using the … We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work? CodeRabbit enforces hourly rate limits for each developer per organization. Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout. Please see our FAQ for further information.
Walkthrough

Adds full Luma integration: DB enums and migration, a Luma connector client, indexing pipeline and background tasks, search integration, researcher prompt/source mappings, connector CRUD and test routes, validation, and frontend pages/types/icons to add, edit, and use Luma sources.
Sequence Diagram(s)

```mermaid
sequenceDiagram
autonumber
actor U as User
participant W as Web UI
participant B as Backend API
participant DB as Database
participant LC as LumaConnector (client)
rect rgba(200,220,255,0.20)
note over U,W: Create Luma connector
U->>W: Open Add Luma page & submit {api_key, space_id}
W->>B: POST /connectors/luma/add
B->>DB: Upsert connector (user, type, config)
DB-->>B: OK
B-->>W: 200 Created/Updated
end
rect rgba(220,255,220,0.20)
note over U,W: Test connector
U->>W: Click "Test"
W->>B: GET /connectors/luma/test
B->>DB: Load connector config
B->>LC: get_user_info(), get_all_events(limit=10)
LC-->>B: user info, events
B-->>W: 200 {user, event_count}
end
rect rgba(255,245,200,0.20)
note over U,W: Trigger indexing
U->>W: Click "Index"
W->>B: POST /connectors/index {type=LUMA_CONNECTOR, dates}
B->>B: Schedule BackgroundTasks -> run_luma_indexing_with_new_session
B-->>W: 202 Accepted
par Background task
B->>DB: New session, load connector
B->>LC: get_events_by_date_range(start,end)
LC-->>B: events
loop per event
B->>DB: Persist Document, Chunks, Embeddings
end
B->>DB: Update last_indexed_at
end
end
```

```mermaid
sequenceDiagram
autonumber
actor U as User
participant W as Web UI
participant R as Researcher Agent
participant CS as ConnectorService
participant DR as Retriever(s)
U->>W: Search query
W->>R: Invoke researcher flow
R->>CS: search_luma(query, user_id, space_id, mode)
alt mode = CHUNKS
CS->>DR: chunk_retriever.hybrid_search(doc_type=LUMA_CONNECTOR)
DR-->>CS: chunks
else mode = DOCUMENTS
CS->>DR: document_retriever.hybrid_search(doc_type=LUMA_CONNECTOR)
DR-->>CS: documents
CS->>CS: _transform_document_results(...)
end
CS-->>R: {sources: [Luma entries], raw}
R-->>W: Stream "Found N Luma events" and results
```

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~75 minutes
Pre-merge checks and finishing touches: ✅ Passed checks (3 passed)
Review by RecurseML
🔍 Review performed on 3fec106..ef361e1
✨ No bugs found, your code is sparkling clean
✅ Files analyzed, no issues (5)
• surfsense_backend/app/connectors/luma_connector.py
• surfsense_backend/app/tasks/connector_indexers/luma_indexer.py
• surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx
• surfsense_backend/app/routes/luma_add_connector_route.py
• surfsense_backend/app/services/connector_service.py
⏭️ Files skipped (22)
- README.md
- surfsense_backend/alembic/versions/21_add_luma_connector_enums.py
- surfsense_backend/app/agents/researcher/nodes.py
- surfsense_backend/app/agents/researcher/qna_agent/prompts.py
- surfsense_backend/app/agents/researcher/sub_section_writer/prompts.py
- surfsense_backend/app/agents/researcher/utils.py
- surfsense_backend/app/db.py
- surfsense_backend/app/routes/__init__.py
- surfsense_backend/app/routes/search_source_connectors_routes.py
- surfsense_backend/app/schemas/search_source_connector.py
- surfsense_backend/app/tasks/connector_indexers/__init__.py
- surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/edit/page.tsx
- surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/page.tsx
- surfsense_web/app/dashboard/[search_space_id]/connectors/add/page.tsx
- surfsense_web/components/dashboard-breadcrumb.tsx
- surfsense_web/components/editConnector/types.ts
- surfsense_web/contracts/enums/connector.ts
- surfsense_web/contracts/enums/connectorIcons.tsx
- surfsense_web/hooks/use-document-by-chunk.ts
- surfsense_web/hooks/use-documents.ts
- surfsense_web/hooks/useConnectorEditPage.ts
- surfsense_web/lib/connectors/utils.ts
Actionable comments posted: 17
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (27)
- README.md (2 hunks)
- surfsense_backend/alembic/versions/21_add_luma_connector_enums.py (1 hunks)
- surfsense_backend/app/agents/researcher/nodes.py (3 hunks)
- surfsense_backend/app/agents/researcher/qna_agent/prompts.py (1 hunks)
- surfsense_backend/app/agents/researcher/sub_section_writer/prompts.py (1 hunks)
- surfsense_backend/app/agents/researcher/utils.py (2 hunks)
- surfsense_backend/app/connectors/luma_connector.py (1 hunks)
- surfsense_backend/app/db.py (2 hunks)
- surfsense_backend/app/routes/__init__.py (2 hunks)
- surfsense_backend/app/routes/luma_add_connector_route.py (1 hunks)
- surfsense_backend/app/routes/search_source_connectors_routes.py (5 hunks)
- surfsense_backend/app/schemas/search_source_connector.py (1 hunks)
- surfsense_backend/app/services/connector_service.py (1 hunks)
- surfsense_backend/app/tasks/connector_indexers/__init__.py (3 hunks)
- surfsense_backend/app/tasks/connector_indexers/luma_indexer.py (1 hunks)
- surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/edit/page.tsx (1 hunks)
- surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/page.tsx (2 hunks)
- surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx (1 hunks)
- surfsense_web/app/dashboard/[search_space_id]/connectors/add/page.tsx (1 hunks)
- surfsense_web/components/dashboard-breadcrumb.tsx (1 hunks)
- surfsense_web/components/editConnector/types.ts (1 hunks)
- surfsense_web/contracts/enums/connector.ts (1 hunks)
- surfsense_web/contracts/enums/connectorIcons.tsx (2 hunks)
- surfsense_web/hooks/use-document-by-chunk.ts (1 hunks)
- surfsense_web/hooks/use-documents.ts (1 hunks)
- surfsense_web/hooks/useConnectorEditPage.ts (4 hunks)
- surfsense_web/lib/connectors/utils.ts (1 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{jsx,tsx}
📄 CodeRabbit inference engine (.rules/require_unique_id_props.mdc)
**/*.{jsx,tsx}: When mapping arrays to React elements in JSX/TSX, each rendered element must include a unique key prop
Keys used for React list items should be stable, predictable, and unique among siblings
Files:
surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/edit/page.tsx
surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/page.tsx
surfsense_web/app/dashboard/[search_space_id]/connectors/add/page.tsx
surfsense_web/components/dashboard-breadcrumb.tsx
surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx
surfsense_web/contracts/enums/connectorIcons.tsx
**/{connector,search}_service.py
📄 CodeRabbit inference engine (.rules/avoid_source_deduplication.mdc)
Do not deduplicate sources when processing search results; preserve every chunk's unique source entry to maintain accurate citation tracking.
Files:
surfsense_backend/app/services/connector_service.py
🧬 Code graph analysis (9)
surfsense_backend/app/schemas/search_source_connector.py (1)
- surfsense_backend/app/db.py (1): SearchSourceConnectorType (55-70)

surfsense_backend/app/tasks/connector_indexers/__init__.py (1)
- surfsense_backend/app/tasks/connector_indexers/luma_indexer.py (1): index_luma_events (28-400)

surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/edit/page.tsx (1)
- surfsense_web/components/editConnector/EditSimpleTokenForm.tsx (1): EditSimpleTokenForm (24-49)

surfsense_web/app/dashboard/[search_space_id]/connectors/add/page.tsx (1)
- surfsense_web/contracts/enums/connectorIcons.tsx (1): getConnectorIcon (21-73)

surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx (3)
- surfsense_web/hooks/useSearchSourceConnectors.ts (1): useSearchSourceConnectors (24-355)
- surfsense_backend/app/db.py (1): SearchSourceConnector (238-253)
- surfsense_web/contracts/enums/connectorIcons.tsx (1): getConnectorIcon (21-73)

surfsense_backend/app/agents/researcher/nodes.py (2)
- surfsense_backend/app/services/connector_service.py (1): search_luma (1856-2014)
- surfsense_backend/app/services/streaming_service.py (1): format_terminal_info_delta (28-47)

surfsense_backend/app/routes/luma_add_connector_route.py (2)
- surfsense_backend/app/db.py (6): BaseModel (135-139), SearchSourceConnector (238-253), SearchSourceConnectorType (55-70), User (301-334), User (338-368), get_async_session (409-411)
- surfsense_backend/app/connectors/luma_connector.py (3): LumaConnector (14-379), get_user_info (96-107), get_all_events (109-145)

surfsense_backend/app/services/connector_service.py (3)
- surfsense_backend/app/agents/researcher/configuration.py (1): SearchMode (11-15)
- surfsense_backend/app/retriver/documents_hybrid_search.py (1): hybrid_search (115-289)
- surfsense_backend/app/retriver/chunks_hybrid_search.py (1): hybrid_search (115-266)

surfsense_backend/app/routes/search_source_connectors_routes.py (3)
- surfsense_backend/app/tasks/connector_indexers/luma_indexer.py (1): index_luma_events (28-400)
- surfsense_backend/app/db.py (1): SearchSourceConnectorType (55-70)
- surfsense_backend/app/tasks/connector_indexers/base.py (1): update_connector_last_indexed (124-139)
🪛 GitHub Actions: Code Quality Checks
surfsense_web/contracts/enums/connector.ts
[error] 16-16: Prettier formatting would modify content at this line. Run the formatter to fix formatting issues.
surfsense_web/lib/connectors/utils.ts
[error] 18-18: Prettier formatting would modify content at this line. Run the formatter to fix formatting issues.
surfsense_web/components/editConnector/types.ts
[error] 1-1: Biome formatting check failed. Formatter would have printed changes in this file.
surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/page.tsx
[error] 54-56: Prettier formatting would modify content in this block. Run the formatter (e.g., Prettier) to auto-fix formatting issues.
surfsense_web/hooks/useConnectorEditPage.ts
[error] 380-380: Using '==' to compare values. Use '===' to avoid type-coercion issues.
[error] 55-55: Prettier formatting would modify content at this line. Run the formatter to fix formatting issues.
surfsense_web/app/dashboard/[search_space_id]/connectors/add/page.tsx
[error] 149-149: Prettier formatting would modify content at this line. Run the formatter to fix formatting issues.
surfsense_web/components/dashboard-breadcrumb.tsx
[error] 90-90: Prettier formatting would modify content at this line. Run the formatter to fix formatting issues.
surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx
[error] 57-57: Prettier formatting would modify content at this line. Run the formatter to fix formatting issues.
[error] 72-74: Prettier formatting would modify content in this block. Run the formatter to fix formatting issues.
surfsense_web/contracts/enums/connectorIcons.tsx
[error] 1-1: Imports not sorted. Organize imports (Biome) and apply fix.
surfsense_backend/app/connectors/luma_connector.py
[error] 386-386: Detect secrets: Potential secrets about to be committed to git repo! Location: surfsense_backend/app/connectors/luma_connector.py:386. Consider removing or safeguarding secrets; see inline pragma or adjust pre-commit config.
🪛 GitHub Actions: pre-commit
surfsense_web/contracts/enums/connector.ts
[error] 16-16: Formatter would have printed the following content: LUMA_CONNECTOR = "LUMA_CONNECTOR",
surfsense_web/lib/connectors/utils.ts
[error] 18-18: Formatter would have printed the following content: LUMA_CONNECTOR: "Luma",
[error] 18-18: LUMA_CONNECTOR: "Luma"
surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/page.tsx
[error] 54-56: Formatter would have printed the following content: Ensure formatting consistency for connector type mapping.
surfsense_web/hooks/useConnectorEditPage.ts
[error] 380-380: lint/suspicious/noDoubleEquals: Using == may be unsafe; use === for strict comparison.
[error] 53-56: Formatter would have printed the following content: ... (format corrections)
surfsense_web/app/dashboard/[search_space_id]/connectors/add/page.tsx
[error] 149-149: Formatter would have printed the following content: },
surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx
[error] 55-55: Formatter would have printed the following content: const [isSubmitting, setIsSubmitting] = useState(false);
[error] 127-129: Formatter would have printed the following content:
Connect your Luma account to search events.
[error] 170-170: Formatter would have printed the following content: <Input type="password" placeholder="Enter your Luma API key" {...field} />
[error] 127-132: Formatter would have printed the following content:
surfsense_web/contracts/enums/connectorIcons.tsx
[error] 1-1: Assist/Source OrganizeImports: imports and exports are not sorted.
surfsense_backend/app/connectors/luma_connector.py
[error] 386-386: Detect secrets: Potential secrets about to be committed to git repo! Secret Type: Secret Keyword
🔇 Additional comments (5)
surfsense_backend/app/agents/researcher/sub_section_writer/prompts.py (1)
41-42: Addition looks good. The new Luma knowledge source matches the existing pattern and keeps the prompt list in sync with the connector additions.
surfsense_backend/app/agents/researcher/qna_agent/prompts.py (1)
41-42: Prompt updates are consistent. Both Q&A prompt variants now advertise the Luma source alongside the other connectors, keeping instructions aligned with the new integration.
Also applies to: 200-201
surfsense_backend/app/schemas/search_source_connector.py (1)
199-209: Validation branch aligns with existing pattern. The Luma connector config check mirrors the other API-key connectors and guards against empty keys, so the schema remains consistent.
surfsense_web/components/dashboard-breadcrumb.tsx (1)
78-91: Run Prettier to fix the new Luma entry

The pipeline is failing because Prettier would modify this line, so the branch won't merge until we match the formatter output. Please rerun the formatter (or add the trailing comma) so CI passes.

```diff
 "serper-api": "Serper API",
 "linkup-api": "LinkUp API",
-"luma-connector": "Luma"
+"luma-connector": "Luma",
```

surfsense_web/contracts/enums/connectorIcons.tsx (1)
1-55: Keep the Tabler icon imports sorted

Assist/Biome is failing because IconSparkles wasn't placed in the sorted position. Please run the organizer (or reorder manually) so the import block stays sorted and CI goes green.

```diff
 IconLinkPlus,
 IconMail,
+IconSparkles,
 IconTable,
 IconTicket,
-IconWorldWww,
-IconSparkles,
+IconWorldWww,
```
```python
elif doc_type == "LUMA_CONNECTOR":
    # Extract Luma-specific metadata
    event_id = metadata.get("event_id", "")
    event_name = metadata.get("event_name", "Untitled Event")
    event_url = metadata.get("event_url", "")
    start_time = metadata.get("start_time", "")
    location_name = metadata.get("location_name", "")
    meeting_url = metadata.get("meeting_url", "")

    title = f"Luma: {event_name}"
    if start_time:
        # Format the start time for display
        try:
            if "T" in start_time:
                from datetime import datetime

                start_dt = datetime.fromisoformat(
                    start_time.replace("Z", "+00:00")
                )
                formatted_time = start_dt.strftime("%Y-%m-%d %H:%M")
                title += f" ({formatted_time})"
        except Exception:
            pass

    description = (
        doc.content[:100] + "..."
        if len(doc.content) > 100
        else doc.content
    )
    if location_name:
        description += f" | Venue: {location_name}"
    elif meeting_url:
        description += f" | Online Event"

    url = event_url if event_url else ""
```
Align Luma metadata keys with what the indexer stores
The indexer saves Luma event metadata as start_at, end_at, and location (see index_luma_events), but this branch reads start_time and location_name. As a result, user-selected Luma documents won’t show their schedule or venue. Please switch to the stored keys (e.g., start_at/location) so the UI surfaces the right details.
🤖 Prompt for AI Agents
In surfsense_backend/app/agents/researcher/nodes.py around lines 416 to 450, the
LUMA_CONNECTOR branch reads metadata keys like "start_time" and "location_name"
but the indexer stores these as "start_at", "end_at" and "location"; update the
metadata lookups to use metadata.get("start_at", "") and
metadata.get("location", "") (and optionally metadata.get("end_at", "") if end
time is needed), rename local variables accordingly (e.g., start_at, location)
and apply the same ISO parsing/formatting logic to start_at so event schedule
and venue render correctly in the UI.
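For illustration, here is a minimal sketch of what the corrected branch could look like, assuming the indexer really does persist start_at, end_at, and location as described above (the doc and metadata variables come from the surrounding branch; this is a sketch, not the final implementation):

```python
elif doc_type == "LUMA_CONNECTOR":
    # Read the keys the Luma indexer is said to persist (assumed: start_at / location)
    event_name = metadata.get("event_name", "Untitled Event")
    event_url = metadata.get("event_url", "")
    start_at = metadata.get("start_at", "")
    location = metadata.get("location", "")

    title = f"Luma: {event_name}"
    if start_at and "T" in start_at:
        try:
            from datetime import datetime

            start_dt = datetime.fromisoformat(start_at.replace("Z", "+00:00"))
            title += f" ({start_dt.strftime('%Y-%m-%d %H:%M')})"
        except ValueError:
            pass  # leave the title without a timestamp if parsing fails

    description = doc.content[:100] + "..." if len(doc.content) > 100 else doc.content
    if location:
        description += f" | Location: {location}"

    url = event_url if event_url else ""
```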
```python
response = requests.get(url, headers=headers, params=params)

if response.status_code == 200:
    return response.json()
elif response.status_code == 401:
    raise Exception("Unauthorized: Invalid Luma API key")
elif response.status_code == 403:
    raise Exception("Forbidden: Access denied or Luma Plus subscription required")
elif response.status_code == 429:
    raise Exception("Rate limit exceeded: Too many requests")
else:
    raise Exception(
        f"API request failed with status code {response.status_code}: {response.text}"
    )

except requests.exceptions.RequestException as e:
    raise Exception(f"Network error: {e}") from e
```
Add a timeout to all outbound Luma API calls.
requests.get without a timeout can hang this worker indefinitely if Luma stalls or the network drops, tying up the connector thread and backlogging the job queue. Please pass an explicit, bounded timeout (and consider making it a constant for reuse) when hitting external services.
```diff
- response = requests.get(url, headers=headers, params=params)
+ response = requests.get(
+     url,
+     headers=headers,
+     params=params,
+     timeout=10,
+ )
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
response = requests.get(url, headers=headers, params=params)
if response.status_code == 200:
    return response.json()
elif response.status_code == 401:
    raise Exception("Unauthorized: Invalid Luma API key")
elif response.status_code == 403:
    raise Exception("Forbidden: Access denied or Luma Plus subscription required")
elif response.status_code == 429:
    raise Exception("Rate limit exceeded: Too many requests")
else:
    raise Exception(
        f"API request failed with status code {response.status_code}: {response.text}"
    )
except requests.exceptions.RequestException as e:
    raise Exception(f"Network error: {e}") from e
```

With the suggestion applied (excerpt):

```python
# Add a timeout to avoid hanging if Luma stalls
response = requests.get(
    url,
    headers=headers,
    params=params,
    timeout=10,
)
if response.status_code == 200:
    return response.json()
```
🤖 Prompt for AI Agents
In surfsense_backend/app/connectors/luma_connector.py around lines 78 to 94,
requests.get is called without a timeout; add a module-level constant (e.g.,
LUMA_API_TIMEOUT = 10) and pass it as the timeout parameter to requests.get
(timeout=LUMA_API_TIMEOUT) so outbound calls are bounded; also catch
requests.exceptions.Timeout separately (or let it propagate as a
RequestException) to provide a clearer error message in the except block and
reuse the constant for any other external Luma calls in this module.
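As a rough sketch of that suggestion (the constant name, its value, and the _bounded_get helper are illustrative assumptions, not part of the PR):

```python
import requests

LUMA_API_TIMEOUT = 10  # seconds; assumed value, tune as needed


def _bounded_get(url: str, headers: dict, params: dict) -> dict:
    """Issue a GET against the Luma API with an explicit timeout."""
    try:
        response = requests.get(
            url, headers=headers, params=params, timeout=LUMA_API_TIMEOUT
        )
    except requests.exceptions.Timeout as e:
        raise Exception(f"Luma API timed out after {LUMA_API_TIMEOUT}s") from e
    except requests.exceptions.RequestException as e:
        raise Exception(f"Network error: {e}") from e

    if response.status_code != 200:
        raise Exception(
            f"API request failed with status code {response.status_code}: {response.text}"
        )
    return response.json()
```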
```python
# Example usage (uncomment to use):
"""
if __name__ == "__main__":
    # Set your API key here
    api_key = "YOUR_LUMA_API_KEY"
    luma = LumaConnector(api_key)
    try:
        # Test authentication
        user_info, error = luma.get_user_info()
        if error:
            print(f"Authentication error: {error}")
        else:
            print(f"Authenticated as: {user_info.get('name', 'Unknown')}")

        # Get all events
        events, error = luma.get_all_events()
        if error:
            print(f"Error fetching events: {error}")
        else:
            print(f"Retrieved {len(events)} events")

        # Format and print the first event as markdown
        if events:
            event_md = luma.format_event_to_markdown(events[0])
            print("\nSample Event in Markdown:\n")
            print(event_md)

        # Get events by date range
        start_date = "2023-01-01"
        end_date = "2023-01-31"
        date_events, error = luma.get_events_by_date_range(start_date, end_date)
        if error:
            print(f"Error: {error}")
        else:
            print(f"\nRetrieved {len(date_events)} events from {start_date} to {end_date}")
    except Exception as e:
        print(f"Error: {e}")
"""
```
Resolve the detect-secrets failure in the example block.
CI is blocking this PR because the placeholder string looks like a real secret. Either drop the inline example from the module or rewrite the placeholder without the API_KEY pattern so the detector no longer flags it (e.g., reference an environment variable in the example instead). This needs to be cleared for the pipeline to pass.
🧰 Tools
🪛 GitHub Actions: Code Quality Checks
[error] 386-386: Detect secrets: Potential secrets about to be committed to git repo! Location: surfsense_backend/app/connectors/luma_connector.py:386. Consider removing or safeguarding secrets; see inline pragma or adjust pre-commit config.
🪛 GitHub Actions: pre-commit
[error] 386-386: Detect secrets: Potential secrets about to be committed to git repo! Secret Type: Secret Keyword
🤖 Prompt for AI Agents
In surfsense_backend/app/connectors/luma_connector.py around lines 382 to 423,
the example block contains a placeholder literal that matches secret patterns
(e.g., "YOUR_LUMA_API_KEY") and causes detect-secrets to fail; remove the
hardcoded placeholder or replace it with a non-secret reference (for example,
show using an environment variable or a clearly non-secret dummy like
"example-key" or use os.environ.get('LUMA_API_KEY')) and/or delete the entire
runnable example block so no API-like literal remains; ensure no strings in that
block resemble API keys so the CI detector stops flagging it.
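One way to keep a runnable example while satisfying detect-secrets is to read the key from the environment instead of a key-like literal. A sketch (the import path and the LUMA_API_KEY variable name are assumptions for illustration; the (value, error) return shape follows the original example):

```python
import os

# Import path assumed from this PR's layout (surfsense_backend/app/connectors/luma_connector.py)
from app.connectors.luma_connector import LumaConnector

if __name__ == "__main__":
    api_key = os.environ.get("LUMA_API_KEY", "")
    if not api_key:
        raise SystemExit("Set the LUMA_API_KEY environment variable first")

    luma = LumaConnector(api_key)
    user_info, error = luma.get_user_info()
    print(error or f"Authenticated as: {user_info.get('name', 'Unknown')}")
```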
```python
api_key = connector.config.get("LUMA_API_KEY")

if not api_key:
    await task_logger.log_task_failure(
        log_entry,
        f"Luma API key not found in connector config for connector {connector_id}",
        "Missing Luma API key",
        {"error_type": "MissingCredentials"},
    )
    return 0, "Luma API key not found in connector config"
```
Critical: Fix Luma API key lookup
connector.config currently stores the key under "api_key" (see add_luma_connector in surfsense_backend/app/routes/luma_add_connector_route.py), so this lookup always returns None. The very next block logs a failure and aborts indexing, meaning every indexing run hard-stops with “Luma API key not found”. Please read the stored key (e.g., connector.config.get("api_key"), optionally falling back to "LUMA_API_KEY") so indexing can actually proceed.
🤖 Prompt for AI Agents
In surfsense_backend/app/tasks/connector_indexers/luma_indexer.py around lines
90 to 100, the code looks up the Luma API key using
connector.config.get("LUMA_API_KEY") but the key is actually stored as
"api_key", causing lookup to always fail and abort indexing; change the lookup
to first try connector.config.get("api_key") and fall back to
connector.config.get("LUMA_API_KEY") if needed, then use that value for the
subsequent missing-key check/log so valid connectors won't be incorrectly
treated as missing credentials.
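A small sketch of that fix, assuming the add-connector route continues to store the credential under "api_key" (the rest mirrors the existing block):

```python
# Prefer the key the add-connector route stores; fall back to the legacy name
api_key = connector.config.get("api_key") or connector.config.get("LUMA_API_KEY")

if not api_key:
    await task_logger.log_task_failure(
        log_entry,
        f"Luma API key not found in connector config for connector {connector_id}",
        "Missing Luma API key",
        {"error_type": "MissingCredentials"},
    )
    return 0, "Luma API key not found in connector config"
```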
| logger.info(f"Retrieved {len(events)} events from Luma API") | ||
|
|
||
| except Exception as e: | ||
| logger.error(f"Error fetching Luma events: {e!s}", exc_info=True) | ||
| return 0, f"Error fetching Luma events: {e!s}" | ||
|
|
Ensure task log is marked failed on fetch errors
When get_events_by_date_range raises (network/API issues, auth problems, etc.), we just log and return. The task’s Log entry remains IN_PROGRESS because we never call log_task_failure, so operators never see the failure status. Please invoke await task_logger.log_task_failure(...) before returning the error.
🤖 Prompt for AI Agents
In surfsense_backend/app/tasks/connector_indexers/luma_indexer.py around lines
207 to 212, the exception handler logs and returns on fetch errors but never
marks the task as failed; update the except block to await
task_logger.log_task_failure(...) with a clear failure message (include the
exception string) and any relevant metadata before returning so the Task Log
status is set to FAILED, then keep the existing logger.error and return values.
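A hedged sketch of what that could look like (luma_client, start_date, and end_date are placeholders; the indexer's actual fetch call may differ slightly):

```python
try:
    events, error = luma_client.get_events_by_date_range(start_date, end_date)
    if error:
        raise RuntimeError(error)
    logger.info(f"Retrieved {len(events)} events from Luma API")
except Exception as e:
    # Mark the task log as FAILED so operators see a terminal state
    await task_logger.log_task_failure(
        log_entry,
        f"Failed to fetch Luma events for connector {connector_id}",
        str(e),
        {"error_type": "FetchError"},
    )
    logger.error(f"Error fetching Luma events: {e!s}", exc_info=True)
    return 0, f"Error fetching Luma events: {e!s}"
```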
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (16)
- surfsense_backend/app/agents/researcher/nodes.py (3 hunks)
- surfsense_backend/app/connectors/luma_connector.py (1 hunks)
- surfsense_backend/app/routes/__init__.py (2 hunks)
- surfsense_backend/app/routes/luma_add_connector_route.py (1 hunks)
- surfsense_backend/app/routes/search_source_connectors_routes.py (5 hunks)
- surfsense_backend/app/services/connector_service.py (1 hunks)
- surfsense_backend/app/tasks/connector_indexers/__init__.py (3 hunks)
- surfsense_backend/app/tasks/connector_indexers/luma_indexer.py (1 hunks)
- surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/page.tsx (2 hunks)
- surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx (1 hunks)
- surfsense_web/app/dashboard/[search_space_id]/connectors/add/page.tsx (1 hunks)
- surfsense_web/components/editConnector/types.ts (1 hunks)
- surfsense_web/contracts/enums/connector.ts (1 hunks)
- surfsense_web/contracts/enums/connectorIcons.tsx (2 hunks)
- surfsense_web/hooks/useConnectorEditPage.ts (4 hunks)
- surfsense_web/lib/connectors/utils.ts (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- surfsense_web/app/dashboard/[search_space_id]/connectors/add/page.tsx
🚧 Files skipped from review as they are similar to previous changes (5)
- surfsense_web/lib/connectors/utils.ts
- surfsense_web/contracts/enums/connector.ts
- surfsense_backend/app/routes/__init__.py
- surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/page.tsx
- surfsense_web/components/editConnector/types.ts
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{jsx,tsx}
📄 CodeRabbit inference engine (.rules/require_unique_id_props.mdc)
**/*.{jsx,tsx}: When mapping arrays to React elements in JSX/TSX, each rendered element must include a unique key prop
Keys used for React list items should be stable, predictable, and unique among siblings
Files:
surfsense_web/contracts/enums/connectorIcons.tsx
surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx
**/{connector,search}_service.py
📄 CodeRabbit inference engine (.rules/avoid_source_deduplication.mdc)
Do not deduplicate sources when processing search results; preserve every chunk's unique source entry to maintain accurate citation tracking.
Files:
surfsense_backend/app/services/connector_service.py
🧠 Learnings (1)
📚 Learning: 2025-08-11T18:17:37.635Z
Learnt from: CR
PR: MODSetter/SurfSense#0
File: .rules/avoid_source_deduplication.mdc:0-0
Timestamp: 2025-08-11T18:17:37.635Z
Learning: Applies to **/{connector,search}_service.py : Do not deduplicate sources when processing search results; preserve every chunk's unique source entry to maintain accurate citation tracking.
Applied to files:
surfsense_backend/app/agents/researcher/nodes.py
🧬 Code graph analysis (6)
surfsense_backend/app/tasks/connector_indexers/__init__.py (1)
- surfsense_backend/app/tasks/connector_indexers/luma_indexer.py (1): index_luma_events (28-401)

surfsense_backend/app/routes/search_source_connectors_routes.py (3)
- surfsense_backend/app/tasks/connector_indexers/luma_indexer.py (1): index_luma_events (28-401)
- surfsense_backend/app/db.py (1): SearchSourceConnectorType (55-70)
- surfsense_backend/app/tasks/connector_indexers/base.py (1): update_connector_last_indexed (124-139)

surfsense_backend/app/routes/luma_add_connector_route.py (3)
- surfsense_backend/app/db.py (6): BaseModel (135-139), SearchSourceConnector (238-253), SearchSourceConnectorType (55-70), User (301-334), User (338-368), get_async_session (409-411)
- surfsense_web/lib/api.ts (3): post (104-121), delete (157-169), get (82-94)
- surfsense_backend/app/connectors/luma_connector.py (3): LumaConnector (14-391), get_user_info (98-109), get_all_events (111-149)

surfsense_backend/app/services/connector_service.py (3)
- surfsense_backend/app/agents/researcher/configuration.py (1): SearchMode (11-15)
- surfsense_backend/app/retriver/documents_hybrid_search.py (1): hybrid_search (115-289)
- surfsense_backend/app/retriver/chunks_hybrid_search.py (1): hybrid_search (115-266)

surfsense_backend/app/agents/researcher/nodes.py (2)
- surfsense_backend/app/services/connector_service.py (1): search_luma (1856-2015)
- surfsense_backend/app/services/streaming_service.py (1): format_terminal_info_delta (28-47)

surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx (3)
- surfsense_web/hooks/useSearchSourceConnectors.ts (1): useSearchSourceConnectors (24-355)
- surfsense_backend/app/db.py (1): SearchSourceConnector (238-253)
- surfsense_web/contracts/enums/connectorIcons.tsx (1): getConnectorIcon (21-73)
🪛 GitHub Actions: pre-commit
surfsense_backend/app/connectors/luma_connector.py
[error] 398-398: Detect secrets: Potential secrets about to be committed to git repo! Secret Type: Secret Keyword
surfsense_web/hooks/useConnectorEditPage.ts
[error] 380-381: lint/suspicious/noDoubleEquals - Using == may be unsafe; use === instead.
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Python Backend Quality
🔇 Additional comments (9)
surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx (1)

70-77: Guard against fetchConnectors() returning undefined before calling .find.

fetchConnectors() returns undefined on failures, so data.find(...) will throw and crash the page whenever the initial fetch fails (network/auth errors, etc.). Add a defensive check before using the array so we silently bail instead of exploding.

```diff
- fetchConnectors().then((data) => {
-   const connector = data.find(
+ fetchConnectors().then((data) => {
+   if (!Array.isArray(data)) {
+     return;
+   }
+   const connector = data.find(
      (c: SearchSourceConnector) => c.connector_type === EnumConnectorName.LUMA_CONNECTOR
    );
```

surfsense_web/hooks/useConnectorEditPage.ts (1)
380-381: Use strict equality for the Luma connector guard.

== is still here, so lint keeps failing and the check allows coercion. Switch to === just like the other connector checks.

```diff
- } else if (connector.connector_type == "LUMA_CONNECTOR") {
+ } else if (connector.connector_type === "LUMA_CONNECTOR") {
```

surfsense_backend/app/agents/researcher/nodes.py (1)
416-451: Align Luma metadata keys with what the indexer stores.

index_luma_events writes start_at, end_at, location, city, etc. into document_metadata, but this branch still reads start_time and location_name, so titles/descriptions for Luma selections lose their schedule and venue. Pull the stored keys instead.

```diff
-    start_time = metadata.get("start_time", "")
-    location_name = metadata.get("location_name", "")
-    meeting_url = metadata.get("meeting_url", "")
+    start_at = metadata.get("start_at", "")
+    end_at = metadata.get("end_at", "")
+    location = metadata.get("location", "")
+    city = metadata.get("city", "")
     event_url = metadata.get("event_url", "")
     title = f"Luma: {event_name}"
-    if start_time:
+    if start_at:
         try:
-            if "T" in start_time:
+            if "T" in start_at:
                 from datetime import datetime
-                start_dt = datetime.fromisoformat(
-                    start_time.replace("Z", "+00:00")
-                )
+                start_dt = datetime.fromisoformat(start_at.replace("Z", "+00:00"))
                 formatted_time = start_dt.strftime("%Y-%m-%d %H:%M")
                 title += f" ({formatted_time})"
-        except Exception:
-            pass
+        except Exception:
+            title += f" ({start_at})"
     description = (
         doc.content[:100] + "..." if len(doc.content) > 100 else doc.content
     )
-    if location_name:
-        description += f" | Venue: {location_name}"
-    elif meeting_url:
-        description += " | Online Event"
+    if location:
+        description += f" | Location: {location}"
+    elif city:
+        description += f" | City: {city}"
+    if end_at:
+        description += f" | Ends: {end_at}"
```

This makes the researcher view render the same metadata the Luma indexer actually records.
surfsense_backend/app/tasks/connector_indexers/luma_indexer.py (4)
92-103: Fix API key lookup

Line [93]: connector.config stores the credential under "api_key" (see the add connector route), so this lookup always returns None and every indexing run aborts. Please read the stored key (falling back to the legacy "LUMA_API_KEY" only if needed) so valid credentials actually work.

```diff
-    api_key = connector.config.get("LUMA_API_KEY")
+    api_key = connector.config.get("api_key") or connector.config.get(
+        "LUMA_API_KEY"
+    )
```
208-211: Mark task log as FAILED on fetch exceptions

Line [209]: When get_events_by_date_range raises, we log and return but never flip the task log to FAILED, leaving the log stuck IN_PROGRESS. Please await task_logger.log_task_failure(...) (including the exception text) before returning so operators see the failure state.

```diff
 except Exception as e:
-    logger.error(f"Error fetching Luma events: {e!s}", exc_info=True)
-    return 0, f"Error fetching Luma events: {e!s}"
+    await task_logger.log_task_failure(
+        log_entry,
+        f"Error fetching Luma events for connector {connector_id}",
+        str(e),
+        {"error_type": "FetchError"},
+    )
+    logger.error(f"Error fetching Luma events: {e!s}", exc_info=True)
+    return 0, f"Error fetching Luma events: {e!s}"
```
239-339: Align stored metadata with search expectations

Line [327]: search_luma reads start_time, end_time, location_name, location_address, meeting_url, and visibility, but we persist start_at, location, etc., so the UI shows empty fields. Please populate the metadata with the keys the search layer consumes (you can keep the old keys for compatibility).

```diff
-location = geo_info.get("address", "")
-city = geo_info.get("city", "")
+location_name = geo_info.get("name", "")
+location_address = geo_info.get("address", "")
+city = geo_info.get("city", "")
+meeting_url = event_data.get("meeting_url", "")
+visibility = event_data.get("visibility", "")
+location = location_address
@@
-"start_at": start_at,
-"end_at": end_at,
-"timezone": timezone,
-"location": location or "No location",
-"city": city,
+"start_time": start_at,
+"end_time": end_at,
+"start_at": start_at,
+"end_at": end_at,
+"timezone": timezone,
+"location_name": location_name,
+"location_address": location_address,
+"location": location or "No location",
+"city": city,
+"meeting_url": meeting_url,
+"visibility": visibility,
@@
-"location": location,
-"city": city,
-"hosts": host_names,
+"location_name": location_name,
+"location_address": location_address,
+"location": location,
+"city": city,
+"hosts": host_names,
+"meeting_url": meeting_url,
+"visibility": visibility,
```
360-364: Update last_indexed_at after successful skip-only runs

Line [361]: total_processed ignores documents_skipped, so a run that only deduplicates never bumps last_indexed_at, causing endless re-fetch loops. Count both indexed and skipped events (or just check documents_indexed + documents_skipped) before updating the timestamp.

```diff
-    total_processed = documents_indexed
-    if total_processed > 0:
+    total_processed = documents_indexed + documents_skipped
+    if total_processed > 0:
         await update_connector_last_indexed(session, connector, update_last_indexed)
```

surfsense_backend/app/connectors/luma_connector.py (2)
78-96: Bound outbound Luma calls with a timeout.

requests.get is still issued without a timeout, so a stalled Luma endpoint can hang the connector task indefinitely. Please add a module-level timeout constant and pass it to every outbound Luma request; optionally surface requests.exceptions.Timeout with a clearer message in the except block.

```diff
-from datetime import datetime
-from typing import Any
-
-import requests
+from datetime import datetime
+from typing import Any
+
+import requests
+
+LUMA_API_TIMEOUT = 10
@@
-        response = requests.get(url, headers=headers, params=params)
+        response = requests.get(
+            url,
+            headers=headers,
+            params=params,
+            timeout=LUMA_API_TIMEOUT,
+        )
```
394-435: Fix detect-secrets false positive in example block.

CI is still failing because the placeholder string YOUR_LUMA_API_KEY matches secret heuristics. Please drop or rewrite this example so no literal looks like a real credential (e.g., reference an env var or remove the snippet entirely).

```diff
-# Example usage (uncomment to use):
-"""
-...
-    api_key = "YOUR_LUMA_API_KEY"
-...
-"""
```
```python
# Update existing connector with new API key
existing_connector.config = {"api_key": request.api_key}
existing_connector.is_indexable = True
await session.commit()
await session.refresh(existing_connector)

logger.info(f"Updated existing Luma connector for user {user.id}")

return {
    "message": "Luma connector updated successfully",
    "connector_id": existing_connector.id,
    "connector_type": "LUMA_CONNECTOR",
}

# Create new Luma connector
db_connector = SearchSourceConnector(
    name="Luma Event Connector",
    connector_type=SearchSourceConnectorType.LUMA_CONNECTOR,
    config={"api_key": request.api_key},
    user_id=user.id,
    is_indexable=True,
)

session.add(db_connector)
await session.commit()
await session.refresh(db_connector)

logger.info(
    f"Successfully created Luma connector for user {user.id} with ID {db_connector.id}"
)

return {
    "message": "Luma connector added successfully",
    "connector_id": db_connector.id,
    "connector_type": "LUMA_CONNECTOR",
}
```
Persist the requested search space
Line [80]: The request carries space_id, but we drop it on both update and create paths. Without storing the selected search space, downstream indexing can’t tell where to write Luma documents, and subsequent updates wipe any existing value. Please persist space_id (e.g., alongside the API key in config, or via the dedicated column if one exists).
```diff
- existing_connector.config = {"api_key": request.api_key}
+ existing_connector.config = {
+     **existing_connector.config,
+     "api_key": request.api_key,
+     "search_space_id": request.space_id,
+ }
@@
-     config={"api_key": request.api_key},
+     config={
+         "api_key": request.api_key,
+         "search_space_id": request.space_id,
+     },
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
# Update existing connector with new API key
existing_connector.config = {"api_key": request.api_key}
existing_connector.is_indexable = True
await session.commit()
await session.refresh(existing_connector)

logger.info(f"Updated existing Luma connector for user {user.id}")

return {
    "message": "Luma connector updated successfully",
    "connector_id": existing_connector.id,
    "connector_type": "LUMA_CONNECTOR",
}

# Create new Luma connector
db_connector = SearchSourceConnector(
    name="Luma Event Connector",
    connector_type=SearchSourceConnectorType.LUMA_CONNECTOR,
    config={"api_key": request.api_key},
    user_id=user.id,
    is_indexable=True,
)

session.add(db_connector)
await session.commit()
await session.refresh(db_connector)

logger.info(
    f"Successfully created Luma connector for user {user.id} with ID {db_connector.id}"
)

return {
    "message": "Luma connector added successfully",
    "connector_id": db_connector.id,
    "connector_type": "LUMA_CONNECTOR",
}
```

With the suggestion applied:

```python
# Update existing connector with new API key
existing_connector.config = {
    **existing_connector.config,
    "api_key": request.api_key,
    "search_space_id": request.space_id,
}
existing_connector.is_indexable = True
await session.commit()
await session.refresh(existing_connector)

logger.info(f"Updated existing Luma connector for user {user.id}")

return {
    "message": "Luma connector updated successfully",
    "connector_id": existing_connector.id,
    "connector_type": "LUMA_CONNECTOR",
}

# Create new Luma connector
db_connector = SearchSourceConnector(
    name="Luma Event Connector",
    connector_type=SearchSourceConnectorType.LUMA_CONNECTOR,
    config={
        "api_key": request.api_key,
        "search_space_id": request.space_id,
    },
    user_id=user.id,
    is_indexable=True,
)

session.add(db_connector)
await session.commit()
await session.refresh(db_connector)

logger.info(
    f"Successfully created Luma connector for user {user.id} with ID {db_connector.id}"
)

return {
    "message": "Luma connector added successfully",
    "connector_id": db_connector.id,
    "connector_type": "LUMA_CONNECTOR",
}
```
[Feature] Add Luma connector
Review by RecurseML
🔍 Review performed on ef361e1..3dc8f7f
✨ No bugs found, your code is sparkling clean
✅ Files analyzed, no issues (17)
• surfsense_backend/app/agents/researcher/nodes.py
• surfsense_backend/app/connectors/luma_connector.py
• surfsense_backend/app/routes/__init__.py
• surfsense_backend/app/routes/luma_add_connector_route.py
• surfsense_backend/app/routes/search_source_connectors_routes.py
• surfsense_backend/app/services/connector_service.py
• surfsense_backend/app/tasks/connector_indexers/__init__.py
• surfsense_backend/app/tasks/connector_indexers/luma_indexer.py
• surfsense_web/app/dashboard/[search_space_id]/connectors/[connector_id]/page.tsx
• surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx
• surfsense_web/app/dashboard/[search_space_id]/connectors/add/page.tsx
• surfsense_web/components/dashboard-breadcrumb.tsx
• surfsense_web/components/editConnector/types.ts
• surfsense_web/contracts/enums/connector.ts
• surfsense_web/contracts/enums/connectorIcons.tsx
• surfsense_web/hooks/useConnectorEditPage.ts
• surfsense_web/lib/connectors/utils.ts
Description
[Feature] Add Luma connector by https://github.com/samkul-swe
Motivation and Context
Users who organize events on Luma couldn't search their event data in SurfSense. This fixes that by letting them connect their Luma account and search through all their events.
Created for Feature Request #330
Changes Overview
Screenshots
API Changes
Types of changes
Testing
Tested:
Checklist:
Notes
High-level PR Summary
This PR adds a new Luma connector to SurfSense, allowing users to search events from their Luma accounts. The implementation includes backend components for API integration, database changes to support the new connector type, and frontend UI for users to enter their Luma API key. The connector fetches event data from Luma's API, processes it, and stores it in the SurfSense database for searching. It handles event metadata like date, location, and description. The UI includes dedicated pages for adding and editing Luma connectors, with appropriate form validation. The changes are well-structured and follow the existing patterns used for other connectors in the system.
⏱️ Estimated Review Time: 5-15 minutes
💡 Review Order Suggestion
1. surfsense_backend/app/connectors/luma_connector.py
2. surfsense_backend/app/db.py
3. surfsense_backend/alembic/versions/21_add_luma_connector_enums.py
4. surfsense_backend/app/routes/luma_add_connector_route.py
5. surfsense_backend/app/tasks/connector_indexers/luma_indexer.py
6. surfsense_backend/app/services/connector_service.py
7. surfsense_backend/app/routes/search_source_connectors_routes.py
8. surfsense_web/app/dashboard/[search_space_id]/connectors/add/luma-connector/page.tsx
9. surfsense_backend/app/agents/researcher/nodes.py
10. surfsense_backend/app/agents/researcher/utils.py
11. README.md

Summary by CodeRabbit
New Features
Documentation