Conversation
- removing all transformersjs tests, except 'feature-extraction'
No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
Configuration used: Central YAML (base), Organization UI (inherited)
Review profile: CHILL
Plan: Pro
Cache: Disabled due to Reviews > Disable Cache setting
Disabled knowledge base sources:
- Linear integration is disabled
📒 Files selected for processing (49)
✅ Files skipped from review due to trivial changes (30)
🚧 Files skipped from review as they are similar to previous changes (8)
📝 Walkthrough
Summary by CodeRabbit

Walkthrough
The pull request removes extensive transformer test cases with cache variants and snapshot validation logic from the test suite, while introducing a new set of example applications and a worker-based request orchestration system. The changes add AI model caching support to CI/CD workflows, replace snapshot-based test validation with example implementations using sample inputs and outputs, and implement a new HTTP server.

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant MainServer as Main Server<br/>(main/index.ts)
    participant EdgeRuntime as EdgeRuntime<br/>User Workers
    participant Worker as Model Worker<br/>(Service)
    Client->>MainServer: HTTP Request
    MainServer->>MainServer: Parse service_name<br/>from pathname
    alt Missing service_name
        MainServer->>Client: 400 JSON Error
    else Valid service_name
        MainServer->>MainServer: Compute servicePath<br/>and create worker config<br/>(memoryLimitMb: 1500,<br/>workerTimeoutMs: 600000,<br/>cpuTimeLimits)
        MainServer->>EdgeRuntime: Create worker with<br/>resource constraints
        alt Worker creation fails
            EdgeRuntime-->>MainServer: Error
            MainServer->>Client: 500 JSON Error
        else Worker created successfully
            EdgeRuntime-->>MainServer: Worker instance
            MainServer->>Worker: Forward HTTP request
            Worker->>Worker: Execute model inference<br/>(feature-extraction,<br/>text-classification, etc.)
            Worker-->>MainServer: Response
            MainServer->>Client: Return worker response
        end
    end
```
Actionable comments posted: 5
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/ci.yml:
- Around line 150-161: The cache step is using `${{ env.EXT_AI_CACHE_DIR }}` but
EXT_AI_CACHE_DIR is only appended to .env after the export action, so the env
var will be empty; fix by defining EXT_AI_CACHE_DIR at workflow-level (top-level
env block) or by replacing `${{ env.EXT_AI_CACHE_DIR }}` in the actions/cache
step with the literal path (e.g., $HOME/.cache/ext-ai-cache), and ensure any
step referencing EXT_AI_CACHE_DIR (the "Set AI model cache directory" echo and
the "Cache AI models" step using actions/cache@v4) uses the same defined value
so the cache path is not empty.
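One hedged way to apply the workflow-level option (a sketch only: the job layout is assumed, and the literal cache path is an assumption for a Linux runner):

```yaml
# Define EXT_AI_CACHE_DIR once at workflow level so both the
# "Set AI model cache directory" step and the actions/cache@v4 step
# resolve the same non-empty path.
env:
  EXT_AI_CACHE_DIR: /home/runner/.cache/ext-ai-cache

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Cache AI models
        uses: actions/cache@v4
        with:
          path: ${{ env.EXT_AI_CACHE_DIR }}
          key: ext-ai-cache-${{ runner.os }}
      - name: Set AI model cache directory
        run: echo "EXT_AI_CACHE_DIR=${{ env.EXT_AI_CACHE_DIR }}" >> .env
```

With the variable defined at the top-level env block, it is available to every step expression, regardless of when the .env file is written.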
In `@examples/transformers-js/main/index.ts`:
- Around line 42-53: The code currently builds envVars from Deno.env.toObject()
(envVarsObj / envVars) and forwards the entire process environment into
EdgeRuntime.userWorkers.create, which risks leaking secrets; change this to
construct a minimal allowlist of environment keys required by the worker (e.g.,
an array of allowed names), populate envVars only with those keys from
envVarsObj, and pass that filtered envVars into EdgeRuntime.userWorkers.create
(keep referencing envVarsObj, envVars and the call site
EdgeRuntime.userWorkers.create to locate where to implement the filter).
- Around line 3-13: The interval-based cleanup spawns overlapping async calls
because setInterval does not await the callback; add an in-flight guard (e.g., a
module-scoped boolean named cleanupInFlight) checked at the top of the
setInterval handler tied to the existing handle so if cleanupInFlight is true
you return early, set cleanupInFlight = true before calling
EdgeRuntime.ai.tryCleanupUnusedSession(), and reset cleanupInFlight = false in a
finally block so only one tryCleanupUnusedSession() runs at a time and
concurrent execution is prevented.
- Around line 23-35: The code treats URL.pathname (variable pathname ->
service_name) as valid even when it's "/" because URL.pathname always begins
with "/" — fix by normalizing and validating the pathname: strip leading and
trailing slashes and whitespace from pathname (e.g., convert pathname ->
service_name = pathname.replace(/^\/+|\/+$/g, '') or similar), then check if
service_name is empty (or equals "." or "..") and return the 400 JSON response
if so; only after that build servicePath with
path.join('./examples/transformers-js', service_name) so servicePath is never
created from "/" or an invalid name.
In `@examples/transformers-js/question-answering/index.ts`:
- Around line 20-23: The request handler in the Deno.serve block is ignoring
incoming payloads by always using SampleInput; change the handler to parse the
request body with await req.json() (cast to Payload) to obtain input and context
instead of unconditionally using SampleInput, and optionally fall back to
SampleInput only when req.json() is empty or invalid; update the variables used
in the handler (input, context) accordingly so client requests are processed.
ℹ️ Review info
Configuration used: Central YAML (base), Organization UI (inherited)
Review profile: CHILL
Plan: Pro
Cache: Disabled due to Reviews > Disable Cache setting
Disabled knowledge base sources:
- Linear integration is disabled
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (117)
- .github/workflows/ci.yml
- crates/base/tests/integration_tests.rs
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/&lt;task&gt;/index.ts and &lt;task&gt;-cache/index.ts for: feature-extraction, fill-mask, image-classification, image-feature-extraction, question-answering, summarization, text-classification, text-generation, text2text-generation, token-classification, translation, zero-shot-classification, zero-shot-image-classification
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/&lt;task&gt;/__snapshot__/{linux,macos}_{x86_64,aarch64}.json (for the tasks that have snapshots)
- examples/ort-rust-backend/index.ts
- examples/transformers-js/main/index.ts
- examples/transformers-js/&lt;task&gt;/index.ts, sample_input.json, and __snapshot__/{linux,macos}_{x86_64,aarch64}.json (where present) for: feature-extraction, fill-mask, image-classification, image-feature-extraction, question-answering, text-classification, text-generation, text2text-generation, token-classification, zero-shot-classification, zero-shot-image-classification
💤 Files with no reviewable changes (56)
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/text-classification/__snapshot__/linux_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/zero-shot-classification/__snapshot__/linux_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/question-answering/__snapshot__/macos_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/question-answering/__snapshot__/linux_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/fill-mask-cache/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/zero-shot-classification/__snapshot__/macos_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/fill-mask/__snapshot__/macos_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/token-classification/__snapshot__/linux_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/text2text-generation-cache/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/image-feature-extraction/__snapshot__/macos_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/image-classification/__snapshot__/macos_x86_64.json
- crates/base/tests/integration_tests.rs
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/image-feature-extraction/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/zero-shot-classification-cache/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/token-classification-cache/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/text-classification/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/zero-shot-image-classification/index.ts
- examples/ort-rust-backend/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/fill-mask/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/question-answering/__snapshot__/linux_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/image-classification/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/image-classification/__snapshot__/macos_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/question-answering/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/question-answering/__snapshot__/macos_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/image-classification/__snapshot__/linux_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/text-generation-cache/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/zero-shot-image-classification/__snapshot__/linux_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/zero-shot-classification/__snapshot__/macos_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/token-classification/__snapshot__/macos_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/image-feature-extraction/__snapshot__/linux_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/token-classification/__snapshot__/linux_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/zero-shot-image-classification/__snapshot__/linux_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/image-classification/__snapshot__/linux_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/image-feature-extraction/__snapshot__/macos_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/feature-extraction-cache/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/summarization-cache/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/text-classification/__snapshot__/macos_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/fill-mask/__snapshot__/linux_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/translation-cache/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/image-feature-extraction/__snapshot__/linux_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/text-generation/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/question-answering-cache/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/token-classification/__snapshot__/macos_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/fill-mask/__snapshot__/macos_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/text2text-generation/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/zero-shot-image-classification/__snapshot__/macos_x86_64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/text-classification/__snapshot__/linux_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/zero-shot-image-classification/__snapshot__/macos_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/fill-mask/__snapshot__/linux_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/zero-shot-classification/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/text-classification-cache/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/text-classification/__snapshot__/macos_aarch64.json
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/token-classification/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/summarization/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/translation/index.ts
- crates/base/test_cases/ai-ort-rust-backend/transformers-js/zero-shot-classification/__snapshot__/linux_aarch64.json
```ts
const handle = setInterval(async () => {
  try {
    const cleanupCount = await EdgeRuntime.ai.tryCleanupUnusedSession();
    if (cleanupCount == 0) {
      return;
    }
    console.log('EdgeRuntime.ai.tryCleanupUnusedSession', cleanupCount);
  } catch (e) {
    console.error(e.toString());
  }
}, 100);
```
🧩 Analysis chain
🌐 Web query:
JavaScript setInterval async callback await behavior overlapping execution
💡 Result:
setInterval(fn, ms) schedules a task to call fn every ms, but it does not wait for fn to “finish” in the Promise/await sense—it only waits for the current synchronous JS call stack to be free before running the next queued interval task. Timers also aren’t guaranteed to run exactly on schedule. [1], [2]
What happens with an async callback
```js
setInterval(async () => {
  await doWork(); // yields back to the event loop
}, 1000);
```

- No simultaneous JS execution on one thread: two interval callbacks won't literally run at the same time on the main thread; a later callback can't start until the current synchronous turn yields/ends. [3]
- But you can get overlapping work: once the callback hits `await`, it yields; the interval can fire again later and start another callback while the previous `doWork()` is still pending. Result: multiple in-flight operations (HTTP requests, DB calls, etc.) running concurrently and completing out of order. [3]
If you need “no overlap”, common fixes
- Lock/guard (skip if still running), a widely recommended pattern:

```js
let running = false;
setInterval(async () => {
  if (running) return;
  running = true;
  try { await doWork(); }
  finally { running = false; }
}, 1000);
```

- Recursive `setTimeout` (serialize by scheduling after completion):

```js
async function loop() {
  await doWork();
  setTimeout(loop, 1000);
}
loop();
```

- Use a purpose-built helper like `set-interval-async`, which is designed to prevent overlapping executions. [4]
Sources: MDN setInterval docs (behavior + timing caveats) [1]; HTML Standard timers notes (no schedule guarantee) [2]; explanation of why async work can overlap even though JS is single-threaded (event loop) [3]; set-interval-async package description [4].
[1] (developer.mozilla.org)
[2] (html.spec.whatwg.org)
[3] (stackoverflow.com)
[4] (npmjs.com)
Citations:
- 1: https://developer.mozilla.org/en-US/docs/Web/API/Window/setInterval?utm_source=openai
- 2: https://html.spec.whatwg.org/multipage/timers.html?utm_source=openai
- 3: https://stackoverflow.com/questions/9219665/javascript-setinterval-do-calls-overlap?utm_source=openai
- 4: https://www.npmjs.com/package/set-interval-async?utm_source=openai
Fix overlapping async cleanup calls
setInterval does not wait for async callbacks to complete before scheduling the next tick. The 100ms interval will fire again while EdgeRuntime.ai.tryCleanupUnusedSession() is still in flight, causing unbounded concurrent cleanup operations. Add an in-flight guard:
```js
let cleanupInFlight = false;
const handle = setInterval(async () => {
  if (cleanupInFlight) return;
  cleanupInFlight = true;
  try {
    const cleanupCount = await EdgeRuntime.ai.tryCleanupUnusedSession();
    if (cleanupCount == 0) {
      return;
    }
    console.log('EdgeRuntime.ai.tryCleanupUnusedSession', cleanupCount);
  } catch (e) {
    console.error(e.toString());
  } finally {
    cleanupInFlight = false;
  }
}, 100);
```
```ts
const service_name = pathname;
console.log(service_name);

if (!service_name || service_name === '') {
  const error = { msg: 'missing function name in request' };
  return new Response(
    JSON.stringify(error),
    { status: 400, headers: { 'Content-Type': 'application/json' } },
  );
}

const servicePath = path.join('./examples/transformers-js', pathname);
```
🧩 Analysis chain
🏁 Script executed:
```shell
cat -n examples/transformers-js/main/index.ts | head -50
```

Repository: supabase/edge-runtime
Length of output: 1863
🌐 Web query:
JavaScript URL API - does new URL("http://localhost/").pathname return empty string or "/"?
💡 Result:
It returns "/", not an empty string.
For HTTP/HTTPS URLs (a “special”/hierarchical scheme), the URL always has at least one (possibly empty) path segment, so pathname always contains at least one slash. (developer.mozilla.org)
So:
```js
new URL("http://localhost/").pathname // "/"
```

(An empty string "" can occur for non-hierarchical/opaque schemes like data:.) (developer.mozilla.org)
Citations:
- 1: https://developer.mozilla.org/en-US/docs/Web/API/URL/pathname?utm_source=openai
- 2: https://developer.mozilla.org/en-US/docs/Web/API/URL/pathname?utm_source=openai
Root path "/" passes validation incorrectly
Line 26 checks !service_name || service_name === '', but URL.pathname always returns at least "/" for HTTP requests. Root requests pass this guard and attempt to create a worker with invalid path, causing failures downstream. Normalize and validate the pathname to extract an actual service name.
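A minimal sketch of the suggested normalization (`parseServiceName` is a hypothetical helper name, not code from the PR):

```typescript
// Strip leading/trailing slashes and whitespace, then reject empty or
// path-traversal names so "/" can never reach path.join().
function parseServiceName(pathname: string): string | null {
  const name = pathname.replace(/^\/+|\/+$/g, "").trim();
  if (name === "" || name === "." || name === "..") {
    return null; // caller should answer with the 400 JSON error
  }
  return name;
}
```

With such a helper, the guard becomes a null check on the parsed name before servicePath is ever built.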
```ts
const envVarsObj = Deno.env.toObject();
const envVars = Object.keys(envVarsObj).map((k) => [k, envVarsObj[k]]);

return await EdgeRuntime.userWorkers.create({
  servicePath,
  memoryLimitMb,
  workerTimeoutMs,
  cpuTimeSoftLimitMs,
  cpuTimeHardLimitMs,
  noModuleCache,
  envVars,
});
```
Forwarding the entire environment to workers is a secret-leak risk
Line 42-43 exports every process env var into the worker. This can expose CI/runtime secrets to example handlers. Only pass a minimal allowlist required by these services.
Suggested fix
```diff
- const envVarsObj = Deno.env.toObject();
- const envVars = Object.keys(envVarsObj).map((k) => [k, envVarsObj[k]]);
+ const allowedEnvKeys = [
+   // add only what handlers need, e.g. "HF_HOME", "TRANSFORMERS_CACHE"
+ ];
+ const envVars = allowedEnvKeys
+   .map((k) => [k, Deno.env.get(k)])
+   .filter(([, v]) => v !== undefined) as [string, string][];
```

As per coding guidelines, "Highlight only issues that could cause runtime errors, data loss, or severe maintainability issues."
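The allowlist filtering can also be factored into a small helper; this is a sketch, and the helper name and any key names are illustrative:

```typescript
// Hypothetical helper: keep only allowlisted env vars, dropping unset keys,
// and return them in the [key, value] tuple shape that envVars uses.
function pickEnv(
  all: Record<string, string>,
  allowed: string[],
): [string, string][] {
  return allowed
    .filter((k) => all[k] !== undefined)
    .map((k) => [k, all[k]] as [string, string]);
}
```

The call site would then pass `pickEnv(Deno.env.toObject(), allowedEnvKeys)` instead of the full environment.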
```ts
Deno.serve(async (req: Request) => {
  //const { input, context } = await req.json() as Payload;
  const { input, context } = SampleInput;
```
Handler ignores request payload and always returns QA for static sample
At Line 20–23, the server never reads req.json() and always uses SampleInput, so client input cannot be processed. That is a functional break for an HTTP example endpoint.
Proposed fix
```diff
  Deno.serve(async (req: Request) => {
-   //const { input, context } = await req.json() as Payload;
-   const { input, context } = SampleInput;
+   const { input, context } =
+     req.method === 'POST'
+       ? (await req.json() as Payload)
+       : SampleInput;
    const output = await pipe(input, context);
    // use '__snapshot__' to assert results
    return Response.json(output);
  });
```

As per coding guidelines, "Comment only when the issue must be resolved before merge — otherwise remain silent."
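The optional fallback to SampleInput on an empty or invalid body can be isolated in a small helper; this is a hedged sketch and the helper name is hypothetical:

```typescript
// Parse a JSON request body string, falling back to a sample payload when
// the body is empty, malformed, or not an object.
function parsePayload<T extends object>(raw: string, fallback: T): T {
  try {
    const value = JSON.parse(raw);
    return value !== null && typeof value === "object" ? (value as T) : fallback;
  } catch {
    return fallback;
  }
}
```

In the handler this would be used as `parsePayload(await req.text(), SampleInput)`, so well-formed client payloads win and the sample is only a last resort.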
7ef353f to 11d7257
What kind of change does this PR introduce?
Tests refactor
What is the current behavior?
AI models are not cached during CI
What is the new behavior?
AI models are cached during CI
Additional context:
AI examples are using a custom `transformers-js/main/index.ts` entrypoint