{"versions":{"1.0.0":{"name":"gpt-tokenizer","version":"1.0.0","description":"BPE Encoder Decoder for GPT-2 / GPT-3","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Nick Walton","url":"https://github.com/nickwalton"},{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/main.js","module":"esm/main.js","source":"src/main.ts","scripts":{"build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022","build:umd":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPT3Encoder' --env 'export=default' --env 'filename=gpt3encoder.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn 
test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/**/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc --noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.6.7","tsx":"^3.12.6"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"6b7f6e6564f914c64216a1d8f0fb8464582c01f9","_id":"gpt-tokenizer@1.0.0","_nodeVersion":"16.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-WXhpEj4YVNFHvhTx0EKOaZ6jyGcU1uCgPRLzANAw1hBz3NKSRGLdRtrLzrUSRLEIu6OMrbxuZX1Jh20bX5XNfw==","shasum":"06f06058b4be01fd43b372e6fc8af1c11539cf69","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-1.0.0.tgz","fileCount":38,"unpackedSize":78237,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEYCIQCfMmuNsTmXMOF3Lf7DAU1OzEFjIqhDe7KBDEgfHXuikQIhAMR+GDFbadkz9KWC7TJ8gIuR3FQszMGgQeW+ZvIHDLN4"}],"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v4.10.10\r\nComment: 
https://openpgpjs.org\r\n\r\nwsFzBAEBCAAGBQJkO5cqACEJED1NWxICdlZqFiEECWMYAoorWMhJKdjhPU1b\r\nEgJ2Vmp8bQ/+OJ1NZZ9ybr/fuNTheOcEw3VkWeij4WHJiTa35InB+EnWsFzW\r\nbH9JcZh9u4DF7E67U7mL3iK9lcOkIPYCnGSVln6j1sX7XUME8vsvDScmAKIU\r\nvxX5YQKQiaPFZbB1O3FwrrDq+dCxoNt0eStTQs3B1FcF1KxiK0N/IVMdhYfX\r\n3eGN3i+VNO0FXcYac6icMu+RbnN+jUTfb270DeqhNeQKo7J/7ynI65pxX2Kb\r\nckLLxWJdI0o/+FGzRvgl6f1lSSmQfYCY4/Ax1I4zEAH5rjtJ+/tyAa8qOgv1\r\ne4cjjzbYMQwtwIQe17X/2/Sjfb8/d4R2M6HBOv7SkCDL+BWGankDraHEUEeR\r\nm2TbqbM1FsSBim6Xysbxq8sT8eoRk/8ND6LPUBQSc7ESZ6AkwNx8yEcud78r\r\nCr2mUf/y4iVnwBHU/hWCjXntdNZ+u8xxgoCsI9/sJ4DW8kSRdQMnTw6NnE7Z\r\ncCa0cMTSm5Ynj57tmgAYfvrgBYVCJxOHlNaU9Zjb580QpzZzoPsBwRQeymic\r\n+v9pPGyG7qj0I1ES/eXkSonicP1SPXOxO3U6WUIoplrH02wJh6RZol76Akdu\r\nRIs0tBBjEVT65f4fr08mrz1jrdCqyB5wXJm/C0vzPVi4K4KYsZvFI0P/TZ7m\r\npOaqHy/sluss5vZVDH914AWU7Hi0qQvgbnA=\r\n=2QRA\r\n-----END PGP SIGNATURE-----\r\n","size":15791},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_1.0.0_1681626922352_0.1432854798465757"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-04-16T06:35:22.514Z","publish_time":1681626922514},"1.0.1":{"name":"gpt-tokenizer","version":"1.0.1","description":"BPE Encoder Decoder for GPT-2 / GPT-3","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Nick Walton","url":"https://github.com/nickwalton"},{"name":"Bazyli 
Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/main.js","module":"esm/main.js","source":"src/main.ts","scripts":{"build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022","build:umd":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPT3Encoder' --env 'export=default' --env 'filename=gpt3encoder.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/**/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.6.7","tsx":"^3.12.6"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"3c12b3343db86fe1721fc37a559ed8f001387756","_id":"gpt-tokenizer@1.0.1","_nodeVersion":"16.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-2NCuA0lMto1+9nCub8VfD1Yxj1FLTueYevJ80v/aqyk/SEp0NhXlyXti5hGWmvtUo0ZA36cmG3nBDx6kUGYrZg==","shasum":"d1a1d76149f546309c733158a720ab56ec861058","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-1.0.1.tgz","fileCount":41,"unpackedSize":2422278,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIQCAyHeL2iEUJ9XAH+zuKl/ldnfZo//93S48LstcYMLGKAIgcpHRiSIXx7Zj09BPpIPrQ344yoAIdCJykDb/sCOr+Rw="}],"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v4.10.10\r\nComment: https://openpgpjs.org\r\n\r\nwsFzBAEBCAAGBQJkO5lDACEJED1NWxICdlZqFiEECWMYAoorWMhJKdjhPU1b\r\nEgJ2Vmqopw//X/DM1rO6v1TIQrSri3yqDHubexMo35p/LLAvc2gkwdEkmA83\r\nZUYg4AhOBmDPV+zEUXiQjjqNNnymxwvU/ECoYJIc2TdjdrPrizdLgpn+OpBy\r\nX4jJAhtyWBD+wXlyDoivXtchQqaTao1f9N1CNZX28ePUHLueL1xhrZsV2tn+\r\nzREIL4jWreI6DK9iabGfHITYwkBMp7NN65Sw58fBAKnLC0tsUN8dxjTzY62J\r\nb5M43w/htkbjJSVHYXk3hf6Rp+0AnNlEhQ06VBDs39IqTLvg24T0UclC99D9\r\nOUcLRrEUC7ahevY4nbahofS45e8tSz/k5VN2cwjOh2Dc9QQKyqzQvcQeXJB/\r\nBkH84enki/5yl1eGqiO8aASc5UXcMJeSda4HdM30cB4Eu51QuxADjm4zjV9/\r\ngXpjH+QtQArm/Y8/pa473b1zyZq4VvbZ3sRjyT9nnlA3N4s3LRzyRMq62hUa\r\nMC9L8m2LaZ7U0fgb3UUiH2Sms2rI2YnHgyfyta45IaH8aoP8glOVkIMxN4P/\r\nfk2IQhmeWPDzEC2vBqczYSI9GyOQ698o/QJSpeTf8JrMSF83VxsALrHVBcR1\r\nG39nKkmyW7E9Qpu2C6R/1PvFXCusiLyxQlcerzGjklRku6jDEPaVX9WEQs1R\r\nYyxnd4jnVwdDxGPWCg+5Vzp2pSlUTnYxoR4=\r\n=6Qon\r\n-----END PGP 
SIGNATURE-----\r\n","size":922083},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_1.0.1_1681627458954_0.05592884879669113"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-04-16T06:44:19.168Z","publish_time":1681627459168},"1.0.2":{"name":"gpt-tokenizer","version":"1.0.2","description":"BPE Encoder Decoder for GPT-2 / GPT-3","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Nick Walton","url":"https://github.com/nickwalton"},{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/gpt3encoder.js","module":"esm/main.js","source":"src/main.ts","scripts":{"build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022","build:umd":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPT3Encoder' --env 'export=default' --env 'filename=gpt3encoder.js'","clean":"git 
clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/**/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc --noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.6.7","tsx":"^3.12.6"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"30d5c41eee147321fc899327be818868ba933811","_id":"gpt-tokenizer@1.0.2","_nodeVersion":"16.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-Be4hw2UooWMMDnXNCMRl5DmRO9yBP6et9FSum6tmqP9GZZ6SxbJByRw2z1XHjW2kxcxhBtKMSrOVpLKOaNPFmg==","shasum":"1c380ed339717d727ed0f781da0f2e798dafe749","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-1.0.2.tgz","fileCount":41,"unpackedSize":2422867,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIQDhxXnodqQh3SJHwNadKkdEcbJR0jB/DgBBWSYV8P/p/QIgV4sJ3ieNiyU79d1DVVYydgX0tvuTR0TKbL01dPrdVi0="}],"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v4.10.10\r\nComment: 
https://openpgpjs.org\r\n\r\nwsFzBAEBCAAGBQJkPlHfACEJED1NWxICdlZqFiEECWMYAoorWMhJKdjhPU1b\r\nEgJ2Vmr9xA/9HI5BJzfVI9pfT9GkR3IyUh7aXO6HHtax8ykjw66yjdFvWwfo\r\nffxrmGQIW7xWb/3DfWzz1dLzJMAEE64Vvz0s93nLifEcjmP8YCwgFIiD3xmn\r\ns/4yIk2/IhJ+G0S+/3Ji/buvHhoMH3v5axCpJDtHtbQeExri7Hj9G2QOBv3Z\r\njN2UXCXABWXrVC4bYAC2yvT/NvyJCKZ8MnzCYEw3mWPa3Mhg42Jv5kcNKWh0\r\ndfdm+H5xbjPs9RsfGAPMbzRyxdU0cT0iv9q+L0GsX3eCdXdcrokR8dHsM1k4\r\nYAnwlM4PWpjob9QSmc7ShZuoMzsgjIxmXX5/XTbtM8ov7ehTPgUhc1m98c3/\r\nWTG13TohVfppydreSi05Z9PodsfxMp8w4T48XS89dGXOmRA6i3KDpqqozNEe\r\n9DI+q9rM7mBCRWOn7wGhgELxR5iBtP5B7Hkf2QNh2fLOzfwQpRTW9fnsn4L7\r\n06cI5EuUQ7bZhf7Kk+HHR/6Kyep92v0+0eJp28quTmplmXtC2zVOa26QVM0k\r\nKdDu1n2kULDTjJbq3sbHdKGR8GSf4qkWIosg5u/1sJC+RqGiLd3Qf4Tmva6u\r\ncqo5nW7tUIgJBcdTcQ6LaVsi6rNmNHebIR5q/lgJLuwt9z96hPCXOunz7VUb\r\nE9gv6/NzSXHUr2laTBoEoqlx75XVwScA1eA=\r\n=Kk6M\r\n-----END PGP SIGNATURE-----\r\n","size":922258},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_1.0.2_1681805790929_0.27356390279653287"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-04-18T08:16:31.211Z","publish_time":1681805791211},"1.0.3":{"name":"gpt-tokenizer","version":"1.0.3","description":"BPE Encoder Decoder for GPT-2 / GPT-3","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Nick Walton","url":"https://github.com/nickwalton"},{"name":"Bazyli 
Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/gpt3encoder.js","module":"esm/main.js","source":"src/main.ts","scripts":{"build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022","build:umd":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPT3Encoder' --env 'export=default' --env 'filename=gpt3encoder.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/**/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.6.7","tsx":"^3.12.6"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"7b9e7fd45dc8216b9341828b3e6bd65351144e8a","_id":"gpt-tokenizer@1.0.3","_nodeVersion":"16.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-Ai0FlpRUVVLO8lW4XpuVM3wDaAWjvRr1i0P5tfpSX3pGKGPulwaHykjxdum8dbmYNsob5YvNDMwuS34Aj597cw==","shasum":"2a9325fc3e125a39308c4a386fea003c519b62aa","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-1.0.3.tgz","fileCount":41,"unpackedSize":2423210,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIQCq5058RcVRIsWZzGYPnmmZALqJxK/JnnC6OLp0JB+eLAIgdvwe+kQGbRMVmuFRewXAmMmEjhk+3hTzZjUg7xCYS64="}],"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v4.10.10\r\nComment: https://openpgpjs.org\r\n\r\nwsFzBAEBCAAGBQJkPlZsACEJED1NWxICdlZqFiEECWMYAoorWMhJKdjhPU1b\r\nEgJ2Vmpg9RAAk8XfApQPhFSFAqY/MrgdtJVWqmqU5L5Tj8Nghrfm+xwYxcCw\r\nLQGuFdbQUTM/+SnzCq4+Zg2IdnoVucNNhxZ55V9kQksSNBMhXBDRE+41Bch0\r\nGH0YULccFE4SyIK28mTLmpcHKxwgAMTPafptg8CPsIGy2tF/cipN4Z379vRG\r\ntEj7gjYfQ2zPXjkxTdWwdk8B/Jt7t6/d8d/Ja1l0xrHZeGLQBxsS2ge3gY4K\r\ndPrSxMcoYY++ydAS+gjFE/PjCEndOf4tVlUXr8QK3+FO2Ld7LGs3JRtQzUbd\r\n5GIauRmU85PLhaPaMQI29oZuuDshI1FtpP9kIZNTPFg59EE4WAHyVHWI2ulX\r\nyzpq+9oJ+HGpegUohtX7of2p9zv5/2EdOVs1Iv5CughQTvcd1txDbIW6+ALd\r\n1HKQ1t0S+m6BcbCW1SZXuxU0t872yOxVHLYTlMVDwQq2FyhcuS9AQbTyE4iq\r\nfQy0oXuszQsM0H1UtMTbXGyo8ERt2cSP8i6MUgoD521b1PANz0TbSTVHXFby\r\nHEoRTjP1jlqDz7x6HiPHraFnrMfwBGDYWP7/F64x1oVfge/BEgjHYxYi23le\r\nAf6sT0Ac+NbgyScsiDNocYzUG1hSJNDTgP+akh2H9WxWuXziihdETj+lJlVI\r\nM0zRngz6phqSq1KszBRZaUPuklbTK3ta16c=\r\n=6GDh\r\n-----END PGP 
SIGNATURE-----\r\n","size":922290},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_1.0.3_1681806956677_0.06905464063470168"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-04-18T08:35:56.926Z","publish_time":1681806956926},"1.0.4":{"name":"gpt-tokenizer","version":"1.0.4","description":"BPE Encoder Decoder for GPT-2 / GPT-3","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Nick Walton","url":"https://github.com/nickwalton"},{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/gpt3encoder.js","module":"esm/main.js","source":"src/main.ts","scripts":{"build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json","build:umd":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 
'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPT3Encoder' --env 'export=default' --env 'filename=gpt3encoder.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc --noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.6.7","tsx":"^3.12.7"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"8b1cb0185e900f3b6f8ea8f75df7f59a287e0f89","_id":"gpt-tokenizer@1.0.4","_nodeVersion":"16.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-+3i5VztAObULo4auhcQMGArHZ3G8LCTI1oX0yy6YuzeXTelrfSNOUlhjVZcHJIVUK8uppI8Arqxpxp9wKc+l3w==","shasum":"ac0ace05b27587f3f9aede642d096699d397afc9","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-1.0.4.tgz","fileCount":49,"unpackedSize":16618650,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEQCIAeuHxhweFj9oearCbqJ4vD3MiZPJ6Hn5RT8kiS5m2KEAiAv+xPoZkF2baiUW+QqHJAEg3xzUcZRC2qPWUeeMe3RzQ=="}],"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v4.10.10\r\nComment: 
https://openpgpjs.org\r\n\r\nwsFzBAEBCAAGBQJkT1GuACEJED1NWxICdlZqFiEECWMYAoorWMhJKdjhPU1b\r\nEgJ2VmpKFRAAgkKuM/IpJaryvKQ8bCU/ipGXTX25SLR5NSkwrP5IElKApwCa\r\nRQyJK/zQu3EY9mf+99h8AsnwujsZ0DvW212bulnFEjqW9eGvNXP0nGg1eRpp\r\nwFioyVHMgfx16GEkAKPHCsXbeWqbMVzyvG0XMtZtilhusR1O08BQU4vCiTdW\r\n7NahUh8yPG4bHK6CUA3/mqyQhJYS0ToIvKQj0BQmwTjjrbhO6iGR5XF2yKmP\r\nDhGNi0IIu+9Jt57KUAwU5pnAS37XUfD9qiPf/d0Tu4XLvbBMYbfeHYNZGUIE\r\n3fIosUQwFgWc/pSw0uDyN1mcXXTDZICYzoYz5qqtFK7G+yXkghBj/XP/2C2q\r\nDjlmO+cL7/PmPT2+q+YkvWFMzOFtQ+p09V5q2LbSBBugipNB+MXob2+SvGyj\r\niSYRgK/kL2u43v3wRERuqzHskBHt8/1mEpuwI34lCA8WsKj81EWKpl9PscaD\r\n/0zqkQa0CzKYgbjIs1XwZKNxaAX07XMqODPnmoyn3EK0fjGIwYIAHijwUp/Y\r\nP7F96xfxYBlN/mfdyOJgumnTFQ/IxEMbNyBfVJ2M1pzOqosY2YXRJy4jsnbo\r\nsURHn5PWwRBwVq14PGwsA8VcyRGxoqOiYrC0NtOdHL6gYWedYYLsTpypu70y\r\n4tWgQg2ZkWTqw1oT4Tqs4RAHPkFElsFXn9E=\r\n=JPlg\r\n-----END PGP SIGNATURE-----\r\n","size":4095272},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_1.0.4_1682919853806_0.24985454926294004"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-05-01T05:44:14.060Z","publish_time":1682919854060,"_source_registry_name":"default"},"1.0.5":{"name":"gpt-tokenizer","version":"1.0.5","description":"BPE Encoder Decoder for GPT-2 / GPT-3","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Nick Walton","url":"https://github.com/nickwalton"},{"name":"Bazyli 
Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/gpt3encoder.js","module":"esm/main.js","source":"src/main.ts","scripts":{"build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json","build:umd":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPT3Encoder' --env 'export=default' --env 'filename=gpt3encoder.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.6.7","tsx":"^3.12.7"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"f7df7e95eec1a8b23b6f7df3a1b853450f0a08a7","_id":"gpt-tokenizer@1.0.5","_nodeVersion":"16.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-uLq42+uNAJENy1AcVJ3VIIA5BimrcDG8AhbohUro2wYF9nYPq92AlGWe299NqC9pBIY3r+oqwYse0OtIWkJAwg==","shasum":"d34e87e001026734f6321d2cf2ce8710332385c1","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-1.0.5.tgz","fileCount":51,"unpackedSize":21035096,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIFW8L743r84Hzd55u5GIJFlIv/aZyjWFlKz3xdk66YQ+AiEA0xQNzT6Mc1zPFSpFuYMT1DR94qv2Yj3FqSLar778hkk="}],"size":5541899},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_1.0.5_1684221576164_0.7559436025514503"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-05-16T07:19:36.459Z","publish_time":1684221576459,"_source_registry_name":"default"},"2.0.0-beta.1":{"name":"gpt-tokenizer","version":"2.0.0-beta.1","description":"BPE Encoder Decoder for GPT-2 / GPT-3","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli 
Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPT3Encoder' --env 'export=default' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPT3Encoder' --env 'export=default' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPT3Encoder' --env 'export=default' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPT3Encoder' --env 
'export=default' --env 'filename=r50k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc --noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.1","tsx":"^3.12.7"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"dependencies":{"rfc4648":"^1.5.2"},"types":"./esm/main.d.ts","readmeFilename":"README.md","gitHead":"eedd944628d67f3a4121447bf45aa83022922800","_id":"gpt-tokenizer@2.0.0-beta.1","_nodeVersion":"16.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-4TVJIpQpNWUNazoWrUL2u6Elj/KB2mndtD1S1NV7wyoMA+MgWZ1o5tz85RIi82iMBQXlUz6YVQycq9J8pUOANw==","shasum":"f29ef0e7553f492096c616b58bef1e301625b094","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.0.0-beta.1.tgz","fileCount":374,"unpackedSize":43965878,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIQCIkE+IXCKkNBhaccDR9ribuwwKmJ/nERfrQ4CMZ7jIfwIgWDf3n6/vhuT1NnY7aRtt/7E/RhsIaB8P89HalIgvRSM="}],"size":10590178},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.0.0-beta.1_1684861
148642_0.8030317012813002"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-05-23T16:59:09.105Z","publish_time":1684861149105,"_source_registry_name":"default"},"2.0.0-beta.2":{"name":"gpt-tokenizer","version":"2.0.0-beta.2","description":"BPE Encoder Decoder for GPT-2 / GPT-3","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer' --env 'export=default' --env 
'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer' --env 'export=default' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer' --env 'export=default' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer' --env 'export=default' --env 'filename=r50k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.1","tsx":"^3.12.7"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"dependencies":{"rfc4648":"^1.5.2"},"types":"./esm/main.d.ts","readmeFilename":"README.md","gitHead":"e660a25ab2388416ca027639828b026a0724a5ea","_id":"gpt-tokenizer@2.0.0-beta.2","_nodeVersion":"16.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-Q/4jMBRSaAr4N3OyuxUVAW0n65R0nAt5U63FGP5nYM2dwZvOeSnPi83fszHu/Yth0dBmN2MnMG5qbnSdYTngMA==","shasum":"cc4b4a09f59eece28ae175a7e1c4b089a6daad3a","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.0.0-beta.2.tgz","fileCount":382,"unpackedSize":44062977,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIDFMQ3QAGOKT1lcd17teZViA5FG1en67W8x1IfbximIhAiEA0ZqbUdweRzOvETDDwRhNW/lU44XJJma9gVzgdnvXf+8="}],"size":10607531},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.0.0-beta.2_1684892424642_0.5724685724857297"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-05-24T01:40:25.015Z","publish_time":1684892425015,"_source_registry_name":"default"},"2.0.0":{"name":"gpt-tokenizer","version":"2.0.0","description":"BPE Encoder Decoder for GPT-2 / GPT-3","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer' --env 'export=default' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer' --env 'export=default' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer' 
--env 'export=default' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer' --env 'export=default' --env 'filename=r50k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc --noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.1","tsx":"^3.12.7"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"dependencies":{"rfc4648":"^1.5.2"},"types":"./esm/main.d.ts","gitHead":"774cf367d8b6737376437b9bd9866bc143e8b33a","_id":"gpt-tokenizer@2.0.0","_nodeVersion":"16.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-41odV6Mma0DUvUdfV4Z3F7cWUyXZSXGdP72coAxBhd6rCKZSu2HuPDkE8X1MA3j64h7Vm//T8IDngMimycPEGQ==","shasum":"c63166f7961b046b0754909d7e249095aead6faa","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.0.0.tgz","fileCount":382,"unpackedSize":44062970,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIEmdMY/1NSC1kfP/GUq9AQ1fM0r60FWLZqvHcL0mAH9ZAiEAx6ZNWrx/WX7CHM+g2wm3qMAOv/qxxcBquc0laOqQcHg="}],"size":10607533},"_npmUser":{"name":"anonymous","
email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.0.0_1684893824460_0.45900778148376875"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-05-24T02:03:44.836Z","publish_time":1684893824836,"_source_registry_name":"default"},"2.1.0":{"name":"gpt-tokenizer","version":"2.1.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": 
\"module\"}' > ./esm/package.json","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'export=api' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'export=api' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'export=api' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'export=api' --env 'filename=r50k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.1","tsx":"^3.12.7"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"dependencies":{"rfc4648":"^1.5.2"},"types":"./esm/main.d.ts","gitHead":"71af9d3415f903c304e9bbf4c8feef33acd48e02","_id":"gpt-tokenizer@2.1.0","_nodeVersion":"16.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-WKlLVBjY60XapsrczNJXlXvSVVVuaCL0bn1zbpDVXyBwrq4m8YDSYVRHZis7hvm7yI+CKwvWSM0lR1TMUjKLcA==","shasum":"38f38bf1bea4b69b813952c2c1dde82d3b2377f9","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.1.0.tgz","fileCount":418,"unpackedSize":44213636,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEYCIQDdTQEAH+1U6LWHdFViV9ZP13+WCb42HH83FztQbONS/AIhALdcd2W61JiYxUrlXCdkv5ZJuWIpWnSaZ2tMSdPnY7l4"}],"size":10626575},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.1.0_1685605327968_0.008176279729137592"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-06-01T07:42:08.442Z","publish_time":1685605328442,"_source_registry_name":"default"},"2.1.1":{"name":"gpt-tokenizer","version":"2.1.1","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' 
--env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc --noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.1","tsx":"^3.12.7"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"dependencies":{"rfc4648":"^1.5.2"},"types":"./esm/main.d.ts","gitHead":"2a55474f9725f6907a5f17fa68cd13d76e8d2f9d","_id":"gpt-tokenizer@2.1.1","_nodeVersion":"16.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-WlX+vj6aPaZ71U6Bf18fem+5k58zlgh2a4nbc7KHy6aGVIyq3nCh709b/8momu34sV/5t/SpzWi8LayWD9uyDw==","shasum":"46df6234398f812d14c1cf3541e289c51da7acab","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.1.1.tgz","fileCount":418,"unpackedSize":44240647,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIDLp5LK0topAwFNy1yXQWQo0mLdu5zIaonFQe8kSdtyaAiEAnHYJjG+R4uxvSzEKqbFjn
zv6OmDxriFQYinUUb6Rpug="}],"size":10627145},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.1.1_1685606743559_0.753794140659116"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-06-01T08:05:44.064Z","publish_time":1685606744064,"_source_registry_name":"default"},"2.1.2":{"name":"gpt-tokenizer","version":"2.1.2","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"yarn rrun tsc --outDir esm 
--module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.1","tsx":"^3.12.7"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"dependencies":{"rfc4648":"^1.5.2"},"types":"./esm/main.d.ts","gitHead":"86c270cb6a27600149f9bd56f675628109c4a134","_id":"gpt-tokenizer@2.1.2","_nodeVersion":"16.20.2","_npmVersion":"8.19.4","dist":{"integrity":"sha512-HSuI5d6uey+c7x/VzQlPfCoGrfLyAc28vxWofKbjR9PJHm0AjQGSWkKw/OJnb+8S1g7nzgRsf0WH3dK+NNWYbg==","shasum":"14f7ce424cf2309fb5be66e112d1836080c2791a","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.1.2.tgz","fileCount":418,"unpackedSize":44242551,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEYCIQCBzM54fVmPRsZcDvvoVXNgIe0RVnh7KTC30/xSDPM1dAIhANd9qzdvgRMSlQI3mCdYJFmNfNb+HlDVCaKLuq/uI+y4"}],"size":10627412},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.1.2_1696640244767_0.883231547392024"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2023-10-07T00:57:25.208Z","publish_time":1696640245208,"_source_registry_name":"default"},"2.2.0":{"name":"gpt-tokenizer","version":"2.2.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack 
--entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.1","tsx":"^3.12.7"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"dependencies":{"rfc4648":"^1.5.2"},"types":"./esm/main.d.ts","gitHead":"5c51087ad639d48bbc3a5837ca48db1f75be32da","_id":"gpt-tokenizer@2.2.0","_nodeVersion":"16.20.2","_npmVersion":"8.19.4","dist":{"integrity":"sha512-r1YZP3+PgFyzuOQAL2wTnKe6alAzAIj31VOS36UyiU6ebM/x4tZzylFBHHDQxGr7VjcbJfEMQ6b6/UNEfSJIGw==","shasum":"976037cf5338295a09ce861890bc29657ceed183","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.2.0.tgz","fileCount":441,"unpackedSize":82842246,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEYCIQDKh0A1HLeFm1rhW9vK1FAFESe1DSfKad+hwAqc1+jQ9gIhALJryZleeQxuskASxJXK/21lDJigXTw3AQlaO/O3uiVs"}],"size":19760195},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.2.0_1721257470089_0.21932303532491848"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-07-17T23:04:30.554Z","publish_time":1721257470554,"_source_registry_name":"default"},"2.2.1":{"name":"gpt-tokenizer","version":"2.2.1","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack 
--entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.1","tsx":"^3.12.7"},"packageManager":"yarn@3.5.0","publishConfig":{"access":"public"},"dependencies":{"rfc4648":"^1.5.2"},"types":"./esm/main.d.ts","gitHead":"3d01a89a1bb07cb776f2c9c3028c63d22f0f98ae","_id":"gpt-tokenizer@2.2.1","_nodeVersion":"16.20.2","_npmVersion":"8.19.4","dist":{"integrity":"sha512-JYvLWTpPtFGz7eS7uixHslv3L96zka0n18MlQeH5YVl5F6mNhssxzSBTeqwNfW8A0AQIMYEaOfbSr+MaoCUvpg==","shasum":"195edebcc60b20c76f25075ffc87cb6b74cbdf3f","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.2.1.tgz","fileCount":456,"unpackedSize":86488441,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEYCIQD5Bu922EvQ0QUFtwSMvFD2K6TxPAf6ni2X3loV+WZz3QIhAMY7TCniwQgY6YPEnHWr9wJ91jAXnKxjfVqlxJf2BJ+B"}],"size":21453830},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.2.1_1721268077544_0.968778084303128"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-07-18T02:01:18.120Z","publish_time":1721268078120,"_source_registry_name":"default"},"2.2.2":{"name":"gpt-tokenizer","version":"2.2.2","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack 
--entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.32","tsx":"^4.19.1","typescript":"^5.6.2"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"dependencies":{"rfc4648":"^1.5.3"},"types":"./esm/main.d.ts","gitHead":"f285cc1abbeb515fc66142c8b3155d292f15584c","_id":"gpt-tokenizer@2.2.2","_nodeVersion":"22.8.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-zob3jMtO/z7HJNF73rnknjAX7q4EaI7XnVlIAsFgl0/dvXrHiArnaIgfDzk73t/YRSR2iee70eFn8m3fMafdVw==","shasum":"e289690445ff0cb9b1e8ac97d25c8c673473ee2e","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.2.2.tgz","fileCount":456,"unpackedSize":86467851,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIQChWG5bV0dclskWA4bzgkMLldPKAdTCwj18GG2rYds0awIgdYiJ70W6v4M42W0bzo15/ZBWv/NskYcsNFQwrRdB+gM="}],"size":21433064},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.2.2_1726441876378_0.26738304260864965"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-09-15T23:11:16.857Z","publish_time":1726441876857,"_source_registry_name":"default"},"2.2.3":{"name":"gpt-tokenizer","version":"2.2.3","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"yarn rrun tsc --outDir esm --module esnext --target es2022 && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack 
--entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.32","tsx":"^4.19.1","typescript":"^5.6.2"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"dependencies":{"rfc4648":"^1.5.3"},"types":"./esm/main.d.ts","gitHead":"fcbf48a553dcc4d6e7b617374880736070d16882","_id":"gpt-tokenizer@2.2.3","_nodeVersion":"22.8.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-EOHvuE+J/sDw36QSWYX3d9fypAPMDvevi/W2XW0Bh+n76Iq3yHuNMHXXe5VmSQfcxIC9CVqyZgPOSxgjgAyQtQ==","shasum":"97ce5505f151eb2eff2f6c2a37b0fe6d9f09df36","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.2.3.tgz","fileCount":456,"unpackedSize":86467851,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEYCIQDDZQIdRpKpclMbwT3VLko67t56Y7EGpH+Kh98AtWmbuQIhAK1b7AZOyCSCT3yO94cOfxCrRPMhOJKT/fDW+PasuNC7"}],"size":21433059},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.2.3_1726460849034_0.13157198700303452"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-09-16T04:27:29.442Z","publish_time":1726460849442,"_source_registry_name":"default"},"2.3.0":{"name":"gpt-tokenizer","version":"2.3.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --module esnext --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack 
--entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.32","tsx":"^4.19.1","typescript":"^5.6.2"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"e2c560aafeda84dcbec61880d552ffbaa69deaac","_id":"gpt-tokenizer@2.3.0","_nodeVersion":"22.9.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-hmY2ECvld67c0j9MCjSN98lQmRdLQV8S3OkpLwOLlqgTqm0ALCzcUjgdMYrthDPS5eGP7qwVpcH+EQ4R9PsCFQ==","shasum":"a500f90a69be35a6d290b7a8094b9293e0b0d43b","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.3.0.tgz","fileCount":603,"unpackedSize":40424742,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIQDEm4x/ZtPuE/HdhJbTF4heFET5iZK3mRsQ60EgagrSwQIga+vkbQJLb564Bo7LzksONbnToHKTc6GXK3TauLOCseE="}],"size":13903915},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.3.0_1726799854809_0.6971077872613936"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-09-20T02:37:35.224Z","publish_time":1726799855224,"_source_registry_name":"default"},"2.4.0":{"name":"gpt-tokenizer","version":"2.4.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --module esnext --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack 
--entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.32","tsx":"^4.19.1","typescript":"^5.6.2"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"f49b46d29d97ef1be697fa8229fd0dbdaf4d8f99","_id":"gpt-tokenizer@2.4.0","_nodeVersion":"22.9.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-UmipHgPmzOVSj1Nu0bajJt7eZYnIttoaUSfgsXs77jFlpKGFBodM0fpa7989rl/pUEONKjye9WqAhlerHKux9Q==","shasum":"bde91fd57dc6dcf2f954597538ccf177bef60512","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.4.0.tgz","fileCount":603,"unpackedSize":40484739,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIQCqaEAzneqYY/VkD7bI/Wuct6t+o3e6WOqonHZFl+zTDAIgUD2TlIfQbCFA7E6ya5HIwDEvv/DnJfuw7K7EqO8FZss="}],"size":13911065},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.4.0_1727059632408_0.7929296072598728"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-09-23T02:47:12.850Z","publish_time":1727059632850,"_source_registry_name":"default"},"2.4.1":{"name":"gpt-tokenizer","version":"2.4.1","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --module esnext --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack 
--entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.32","tsx":"^4.19.1","typescript":"^5.6.2"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"bf4b459d8d99903264698f606bdd9a31ca0b724f","_id":"gpt-tokenizer@2.4.1","_nodeVersion":"22.9.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-toR/iMboG04xErwQGdtf4KuW4dPsiRqIh8qQ5i+Us8PW0iThUf+FqY0rNUMA8hRauf3kJttR7uVae7MNgx4OXA==","shasum":"77769429d0f85c8d16d9a3c842b4c808d10051d2","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.4.1.tgz","fileCount":603,"unpackedSize":40484739,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCICFdG3N0pLYIWb2y3Ynok/tm4paj0bDrhYixFfRuA77TAiEAk+wFq2aGATky6sRB0NoMlfr85aczrP5BD3AU7QUXkr0="}],"size":13911068},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.4.1_1728271627720_0.3773363103113583"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-10-07T03:27:08.175Z","publish_time":1728271628175,"_source_registry_name":"default"},"2.5.0":{"name":"gpt-tokenizer","version":"2.5.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --module esnext --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack 
--entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.32","tsx":"^4.19.1","typescript":"^5.6.2"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"41673afe7078c73d439583ffd470b6c52ed4f625","_id":"gpt-tokenizer@2.5.0","_nodeVersion":"22.9.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-O/hZmEqmDNjhj2LsO5Ly8Y8927vyI5J1FLbutkFkRo/S1AredRU/Mt2RFy3O71B3Bm6REz0a+3ezBVeYQaCS8Q==","shasum":"610486e543098be0c0e161196c577c620d4bdb53","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.5.0.tgz","fileCount":617,"unpackedSize":40511073,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEYCIQCqybpVX9jjwaaEX9oQHWdoy4MdTpuID+a72lRnYEp/1QIhAOPN7lG0pyItShNrkYnYJYgon7SqJmKKp09fsL69Ohtn"}],"size":13911839},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.5.0_1728454149892_0.6538520983773857"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-10-09T06:09:10.316Z","publish_time":1728454150316,"_source_registry_name":"default"},"2.5.1":{"name":"gpt-tokenizer","version":"2.5.1","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"yarn tsx src/codegen/generateByModel.ts","codegen:encodings":"yarn tsx src/codegen/generateJsEncodings.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --module esnext --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack 
--entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.32","tsx":"^4.19.1","typescript":"^5.6.2"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"2305f148309ab3306822ade577ddf0850e1a8659","_id":"gpt-tokenizer@2.5.1","_nodeVersion":"22.10.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-26zNjvGrIf+a6yWg5l2DvNT4LXAmotHyx7IomHVhXiUs62BwKVFLv/l8yRQQrkUDc2XDtzCdjcNuJqzOjxxiPA==","shasum":"ff1175b9ae1325f0f5281e9797af078cb29295dc","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.5.1.tgz","fileCount":617,"unpackedSize":40511073,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEYCIQCpbiZ2IfU/RBwJ0iB+O8Ck0fBfAtR4R8/HeV5Khh8iaQIhAPhXRKvFyRSaG/UbOdEX8fnHL5BFNwT8t8NgElYWiU8/"}],"size":13911842},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.5.1_1729481272058_0.9712988121262129"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-10-21T03:27:52.559Z","publish_time":1729481272559,"_source_registry_name":"default"},"2.6.0":{"name":"gpt-tokenizer","version":"2.6.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"rm -rf src/model && yarn tsx src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && yarn tsx src/codegen/generateJsBpe.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --module esnext --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 
'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.32","tsx":"^4.19.1","typescript":"^5.6.2"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"bc964675d2fc795e1692619d1907093944b2e8ff","_id":"gpt-tokenizer@2.6.0","_nodeVersion":"22.11.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-4NzSmroimN+yEg8KFmG+URerBtVHrOIXhcohn5TgmaKWzVOIeZ5AJshQzI3lJybYnLk4HWHWy/deIw+VSfcw2g==","shasum":"2d941218b1f8826f0780965a94977079ca14ae98","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.6.0.tgz","fileCount":771,"unpackedSize":41468012,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIQC4165P3xjuKF0k0oZ32gxa9f7h5JEN7gtHaBg8TDby9AIgFh+RSErR0T1yQlACmZZIU2JCPv8PUe4svyz+wjIWzI4="}],"size":14027900},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.6.0_1730699762703_0.643265584626536"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-11-04T05:56:03.078Z","publish_time":1730699763078,"_source_registry_name":"default"},"2.6.1":{"name":"gpt-tokenizer","version":"2.6.1","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"rm -rf src/model && yarn tsx src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && yarn tsx src/codegen/generateJsBpe.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo 
webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.37","tsx":"^4.19.2","typescript":"^5.6.3"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"ed01980d8c88e28bcff8e3514e78b9d6c2f486d5","_id":"gpt-tokenizer@2.6.1","_nodeVersion":"22.10.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-o/gLZb9EFV4ivihD1SfKip6+WjVQLPx14w6QDJJ8S1S6fEiGlgiAlVvpdWSL9keUwOexDwm7DdsfedsDJFQKmg==","shasum":"7099ffd12f4a8499369b9c3361873915a788c068","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.6.1.tgz","fileCount":771,"unpackedSize":41525353,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEUCIQC1EuSAGrQcaZQBb2/wFqt0V1k2NKGwG0zEOrnhO+HajQIgTNCUlHlpeAyALi6uyQFkKe9D/Q93qSzL7bj9vFGiyos="}],"size":14041646},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.6.1_1731285510303_0.39437149547932715"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-11-11T00:38:30.681Z","publish_time":1731285510681,"_source_registry_name":"default"},"2.6.2":{"name":"gpt-tokenizer","version":"2.6.2","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"rm -rf src/model && yarn tsx src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && yarn tsx src/codegen/generateJsBpe.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo 
webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.37","tsx":"^4.19.2","typescript":"^5.6.3"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"3547826b37e829009a40d421a3733a54d13cd452","_id":"gpt-tokenizer@2.6.2","_nodeVersion":"22.10.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-OznIET3z069FiwbLtLFXJ9pVESYAa8EnX0BMogs6YJ4Fn2FIcyeZYEbxsp2grPiK0DVaqP1f+0JR/8t9R7/jlg==","shasum":"90e6932c7b5f73df7c13d360802edb43a2776586","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.6.2.tgz","fileCount":778,"unpackedSize":41554334,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEQCIG6bmgDgSkdwbk5340HVCfjnOi24NfmilssnLieHKnxZAiAgg4QiF2TgM7ARh383shCwtvz+9szl8zeyzv/n2fdRiA=="}],"size":14045041},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.6.2_1731479172414_0.6110996188381312"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-11-13T06:26:12.906Z","publish_time":1731479172906,"_source_registry_name":"default"},"2.7.0":{"name":"gpt-tokenizer","version":"2.7.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"rm -rf src/model && yarn tsx src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && yarn tsx src/codegen/generateJsBpe.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo 
webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.37","tsx":"^4.19.2","typescript":"^5.6.3"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"2d4146a064d9dc6c2512bf6c869b05f2d18ce741","_id":"gpt-tokenizer@2.7.0","_nodeVersion":"22.11.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-QjxaGgCZgKp8ecZzy7AmrCbYs+DD+y7GWSRwbe2ZiHPBs1EaK8xUIrt8irnmkAQcNMflpD27tk5yF4m9ig3wgw==","shasum":"30cb445dd3102ca921c446db300f97a4a9d8a577","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.7.0.tgz","fileCount":778,"unpackedSize":41564172,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEQCIGm9Km6i0hlzJCrVOkDvi6frZFCB+QM2LVNuqHS6hiICAiB/bmkOHpZ/IdzKJL80F9cURoLNfDfxr91MPnbYfgVHaQ=="}],"size":14046331},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages","tmp":"tmp/gpt-tokenizer_2.7.0_1732758631417_0.9713624200684079"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-11-28T01:50:32.288Z","publish_time":1732758632288,"_source_registry_name":"default"},"2.8.0":{"name":"gpt-tokenizer","version":"2.8.0","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"author":{"url":"https://github.com/niieani","name":"Bazyli Brzoska","email":"npm@invent.life"},"license":"MIT","_id":"gpt-tokenizer@2.8.0","maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"contributors":[{"url":"https://github.com/niieani","name":"Bazyli 
Brzoska","email":"npm@invent.life"}],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"dist":{"shasum":"d4b8a1e3089d74a6c0dde338ab784b0b29385e19","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.8.0.tgz","fileCount":792,"integrity":"sha512-SdFDehttglkjsc4SqH+CUGZK/Kb73mAXDBQltUTC+CZbQ8gYh1Ad/rZ1cPRHz8+1o4dHxLXuM6KrlxjgcXR/dA==","signatures":[{"sig":"MEUCIQCEvC/fog59sI5ij25QhWYe/J3WrCVH59xBD8Et0me5EQIgXHyBaie+jM0nFGYoYD4YSmHH/h6S9H1t60SFgmbrLJc=","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":41653822,"size":14061329},"main":"esm/main.js","types":"./esm/main.d.ts","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"gitHead":"15d13b1a35047d531efda795200257183b892a93","release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"scripts":{"test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","build":"yarn build:cjs && yarn build:esm && yarn build:umd","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": 
\"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","test:code":"rrun jest","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc --noEmit","codegen:bpe":"rm -rf src/bpeRanks && yarn tsx src/codegen/generateJsBpe.ts","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","codegen:models":"rm -rf src/model && yarn tsx src/codegen/generateByModel.ts","postinstallDev":"yarn prepare","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 
'filename=cl100k_base.js'"},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"repository":{"url":"git+https://github.com/niieani/gpt-tokenizer.git","type":"git"},"_npmVersion":"8.19.4","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","directories":{},"resolutions":{"typescript":"5.6.2"},"_nodeVersion":"22.11.0","publishConfig":{"access":"public"},"_hasShrinkwrap":false,"packageManager":"yarn@4.5.0","devDependencies":{"tsx":"^4.19.2","typescript":"^5.6.3","@niieani/scaffold":"^1.7.37"},"_npmOperationalInternal":{"tmp":"tmp/gpt-tokenizer_2.8.0_1733737428075_0.4958467721608746","host":"s3://npm-registry-packages-npm-production"},"_cnpmcore_publish_time":"2024-12-09T09:43:48.404Z","publish_time":1733737428404,"_source_registry_name":"default"},"2.8.1":{"name":"gpt-tokenizer","version":"2.8.1","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli 
Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"rm -rf src/model && yarn tsx src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && yarn tsx src/codegen/generateJsBpe.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 
'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"rrun jest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc --noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@niieani/scaffold":"^1.7.37","tsx":"^4.19.2","typescript":"^5.6.3"},"resolutions":{"typescript":"5.6.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"0f70122bf0b7f91b71039d46857a36c8f1848f63","_id":"gpt-tokenizer@2.8.1","_nodeVersion":"22.11.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-8+a9ojzqfgiF3TK4oivGYjlycD8g5igLt8NQw3ndOIgLVKSGJDhUDNAfYSbtyyuTkha3R/R9F8XrwC7/B5TKfQ==","shasum":"505d5d05ed8db9871ad6f50adcbb87111654134c","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.8.1.tgz","fileCount":792,"unpackedSize":41660616,"signatures":[{"keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA","sig":"MEQCIG6QN/Q9oJbL95mp8ZPyIgmPc0QRyqDMaLxt7qsK06rzAiA2l2VBqjjfpGBUagA+YtGgEfmSIWsRO2VIEw15faiG0A=="}],"
size":14063124},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages-npm-production","tmp":"tmp/gpt-tokenizer_2.8.1_1733739109386_0.31844005498970485"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2024-12-09T10:11:50.050Z","publish_time":1733739110050,"_source_registry_name":"default"},"2.9.0":{"name":"gpt-tokenizer","version":"2.9.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen:models":"rm -rf src/model && yarn tsx src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && yarn tsx src/codegen/generateJsBpe.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir 
-p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"vitest","test:format":"yarn rrun prettier --check 
\"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc --noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@edge-runtime/vm":"^5.0.0","@niieani/scaffold":"^1.7.39","@swc/cli":"^0.5.2","@swc/core":"^1.10.4","tsx":"^4.19.2","typescript":"^5.7.2","vitest":"^2.1.8"},"resolutions":{"typescript":"5.7.2"},"packageManager":"yarn@4.5.0","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"e2506c229c542384928e35b34a2d3ed07cf68a10","_id":"gpt-tokenizer@2.9.0","_nodeVersion":"22.14.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-YSpexBL/k4bfliAzMrRqn3M6+it02LutVyhVpDeMKrC/O9+pCe/5s8U2hYKa2vFLD5/vHhsKc8sOn/qGqII8Kg==","shasum":"1f0639fa6667c8fae2ecda6245dbd4bde3b2745f","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-2.9.0.tgz","fileCount":883,"unpackedSize":42220888,"signatures":[{"keyid":"SHA256:DhQ8wR5APBvFHLF/+Tc+AYvPOdTpcIDqOhxsBHRwC7U","sig":"MEUCIQD4RYUb0cK9nIW78FEj/p/CEErhw4o1Ob6L5ECjuo6kKwIgOp5vd9xOadLxdBSlNoNxkwizfYJ6NveAbtIl6otgshc="}],"size":14086111},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages-npm-production","tmp":"tmp/gpt-tokenizer_2.9.0_1741142440113_0.993543359821554"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2025-03-05T02:40:40.480Z","publish_time":1741142440480,"_source_registry_name":"default"},"3.0.0":{"name":"gpt-tokenizer","version":"3.0.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI 
models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen":"yarn codegen:bpe && yarn codegen:chat-enabled && yarn codegen:models","codegen:models":"rm -rf src/model && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateJsBpe.ts","codegen:chat-enabled":"rm -rf src/chat && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateChatEnabled.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && 
yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config && echo '\n**/*.gen.ts\nsrc/models.ts' >> .prettierignore","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"vitest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\" --ignore-path .prettierignore","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@edge-runtime/vm":"^5.0.0","@niieani/scaffold":"^1.7.49","@swc/cli":"^0.7.7","@swc/core":"^1.11.31","devalue":"^5.1.1","node-resolve-ts":"^1.0.2","typescript":"^5.8.3","vitest":"^3.2.2"},"resolutions":{"typescript":"5.8.3","prettier":"^3"},"packageManager":"yarn@4.9.2","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"1d1d76d86bd22d89b89ebba09f5f7b4eed04c4eb","_id":"gpt-tokenizer@3.0.0","_nodeVersion":"22.15.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-S7SWrQHkl73wpZEJjeIh9Gfx2aVHudAEadezI2/SSvqRMtQiUjjHDah4m8LWTxcw9Fx9VphNXpqdnMo3FapDFg==","shasum":"883b5ea37c4e71d64a210eb497ab6a6a30bd721d","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-3.0.0.tgz","fileCount":1108,"unpackedSize":46020571,"signatures":[{"keyid":"SHA256:DhQ8wR5APBvFHLF/+Tc+AYvPOdTpcIDqOhxsBHRwC7U","sig":"MEQCICE81mJGmeqNqXiYsUFCwbBihmgZg+DE0LCSJ1nCwSPMAiBuYbvHSFxSfKg0+EQXEp7RtXfQ4nO/XbaH8q4JKiqdGg=="}],"size":15364672},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages-npm-production","tmp":"tmp/gpt-tokenizer_3.0.0_1749336660692_0.06316216244411166"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2025-06-07T22:51:01.186Z","publish_time":1749336661186,"_source_registry_name":"default"},"3.0.1":{"name":"gpt-tokenizer","version":"3.0.1","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen":"yarn codegen:bpe && yarn codegen:chat-enabled && yarn codegen:models","codegen:models":"rm -rf src/model && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateJsBpe.ts","codegen:chat-enabled":"rm -rf src/chat && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateChatEnabled.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 
'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config && echo '\n**/*.gen.ts\nsrc/models.ts' >> .prettierignore","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"vitest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\" --ignore-path .prettierignore","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@edge-runtime/vm":"^5.0.0","@niieani/scaffold":"^1.7.49","@swc/cli":"^0.7.7","@swc/core":"^1.11.31","devalue":"^5.1.1","node-resolve-ts":"^1.0.2","typescript":"^5.8.3","vitest":"^3.2.2"},"resolutions":{"typescript":"5.8.3","prettier":"^3"},"packageManager":"yarn@4.9.2","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"f77db29c1034fbf254b6162d20d4874fc4d158b9","_id":"gpt-tokenizer@3.0.1","_nodeVersion":"22.16.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-5jdaspBq/w4sWw322SvQj1Fku+CN4OAfYZeeEg8U7CWtxBz+zkxZ3h0YOHD43ee+nZYZ5Ud70HRN0ANcdIj4qg==","shasum":"19fa42314d15b69a1e82d3898336b5ba1f4f2c86","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-3.0.1.tgz","fileCount":1122,"unpackedSize":46090654,"signatures":[{"keyid":"SHA256:DhQ8wR5APBvFHLF/+Tc+AYvPOdTpcIDqOhxsBHRwC7U","sig":"MEYCIQCnzyTRB68PuYcgvpdeTa2pDVPno00ajEwUwAUdfOO1xQIhAO15QXIJXIDo5xwdmUqzN+Tod1wYNTrmpNT99jdXDyXX"}],"size":15367304},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages-npm-production","tmp":"tmp/gpt-tokenizer_3.0.1_1749791051098_0.7489273387289539"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2025-06-13T05:04:11.499Z","publish_time":1749791051499,"_source_registry_name":"default"},"3.1.0":{"name":"gpt-tokenizer","version":"3.1.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen":"yarn codegen:bpe && yarn codegen:chat-enabled && yarn codegen:models","codegen:models":"rm -rf src/model && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateJsBpe.ts","codegen:chat-enabled":"rm -rf src/chat && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateChatEnabled.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 
'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config && echo '\n**/*.gen.ts\nsrc/models.ts' >> .prettierignore","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"vitest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\" --ignore-path .prettierignore","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@edge-runtime/vm":"^5.0.0","@niieani/scaffold":"^1.7.54","@swc/cli":"^0.7.8","@swc/core":"1.13.5","devalue":"^5.3.2","node-resolve-ts":"^1.0.2","typescript":"^5.9.3","vitest":"^3.2.4"},"resolutions":{"typescript":"5.8.3","prettier":"^3"},"packageManager":"yarn@4.10.3","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"135a8513f31459b569b2899f676bc1fa4b4e8ca3","_id":"gpt-tokenizer@3.1.0","_nodeVersion":"22.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-+yvuInbZAFXc0/WqytMGYo6ZlgKUXuRcpQrfbmJEDs4x09BixO1NphmHfsrzMc45aoPn/LgLx8y8ogJXVOPAhg==","shasum":"98c8782a5928fe6356e7c49cdb171070691f3b29","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-3.1.0.tgz","fileCount":1311,"unpackedSize":47069547,"signatures":[{"keyid":"SHA256:DhQ8wR5APBvFHLF/+Tc+AYvPOdTpcIDqOhxsBHRwC7U","sig":"MEQCIA8Y79/nPb/McpMmUkudXA+Q8jzOmgKaJrgexzSAluc2AiAebYzpRQP5mbeidhADE0Bn86eZcVV+qApjL9J3j2Ru0w=="}],"size":15405449},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages-npm-production","tmp":"tmp/gpt-tokenizer_3.1.0_1760050040783_0.11573631248962202"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2025-10-09T22:47:21.190Z","publish_time":1760050041190,"_source_registry_name":"default"},"3.2.0":{"name":"gpt-tokenizer","version":"3.2.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen":"yarn codegen:bpe && yarn codegen:chat-enabled && yarn codegen:models","codegen:models":"rm -rf src/model && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateJsBpe.ts","codegen:chat-enabled":"rm -rf src/chat && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateChatEnabled.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 
'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config && echo '\n**/*.gen.ts\nsrc/models.ts' >> .prettierignore","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"vitest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\" --ignore-path .prettierignore","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc 
--noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@edge-runtime/vm":"^5.0.0","@niieani/scaffold":"^1.7.54","@swc/cli":"^0.7.8","@swc/core":"1.13.5","devalue":"^5.3.2","node-resolve-ts":"^1.0.2","typescript":"^5.9.3","vitest":"^3.2.4"},"resolutions":{"typescript":"5.8.3","prettier":"^3"},"packageManager":"yarn@4.10.3","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"dcc878381f2b173998141d0adefaf5116c33e6b3","_id":"gpt-tokenizer@3.2.0","_nodeVersion":"22.20.0","_npmVersion":"8.19.4","dist":{"integrity":"sha512-QRRhzJIHcGbbdzhMGNCHhF+98RVaBaVD5+NYgyPOamqRBEqG5yN9p5j4udnY5FJnkesjdUPsikG/DO4OfJdhpQ==","shasum":"84e95c8bb06500efc17a07c0725cf8daaef7f4d0","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-3.2.0.tgz","fileCount":1311,"unpackedSize":47119610,"signatures":[{"keyid":"SHA256:DhQ8wR5APBvFHLF/+Tc+AYvPOdTpcIDqOhxsBHRwC7U","sig":"MEUCIQCffYqS+iDgdkBlXZLeQWOIXwlYmpJ1ljl14+FRAQTVsAIgVRAOJO9El9Xm4I1O9tuuV06bP/21jrS8CyCxlSdamnQ="}],"size":15409428},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages-npm-production","tmp":"tmp/gpt-tokenizer_3.2.0_1760051522519_0.9277299286923708"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2025-10-09T23:12:02.948Z","publish_time":1760051522948,"_source_registry_name":"default"},"3.3.0":{"name":"gpt-tokenizer","version":"3.3.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language Processing","Text Generation","OpenAI","Machine 
Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen":"yarn codegen:bpe && yarn codegen:chat-enabled && yarn codegen:models","codegen:models":"rm -rf src/model && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateJsBpe.ts","codegen:chat-enabled":"rm -rf src/chat && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateChatEnabled.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base && yarn build:umd:o200k_harmony","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 
'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","build:umd:o200k_harmony":"beemo webpack --entry='./src/encoding/o200k_harmony.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_harmony' --env 'filename=o200k_harmony.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config && echo '\n**/*.gen.ts\nsrc/models.ts' >> .prettierignore","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"vitest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\" --ignore-path .prettierignore","test:lint":"rrun eslint 
'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc --noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@edge-runtime/vm":"^5.0.0","@niieani/scaffold":"^1.7.54","@swc/cli":"^0.7.8","@swc/core":"1.13.5","devalue":"^5.3.2","node-resolve-ts":"^1.0.2","typescript":"^5.9.3","vitest":"^3.2.4"},"resolutions":{"typescript":"5.8.3","prettier":"^3"},"packageManager":"yarn@4.10.3","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"59422fd8b68987dc0a207e43c48198c1346ecc50","_id":"gpt-tokenizer@3.3.0","_nodeVersion":"22.21.1","_npmVersion":"8.19.4","dist":{"integrity":"sha512-3cWaYQkQkMoqVR6h1lIsE+fjvasElSfl48+Gy2/88gCFvzXPYGFr0Cl6Ac483rvsXcsdBRyDxFDPuVIB+bsCzw==","shasum":"b7a687d9d8441d5913d390796612eec0335b5bc4","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-3.3.0.tgz","fileCount":1327,"unpackedSize":52856533,"signatures":[{"keyid":"SHA256:DhQ8wR5APBvFHLF/+Tc+AYvPOdTpcIDqOhxsBHRwC7U","sig":"MEUCIQDkuuCgHsBeTCNZKdbsZkawKC4opqNKyxaJB4kiYa1LdQIgUpDCsIOmQOlYLcZ9NGmWwak+nUZgdH22lu8VDjKC7KU="}],"size":17639896},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages-npm-production","tmp":"tmp/gpt-tokenizer_3.3.0_1762474275611_0.9645018954846218"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2025-11-07T00:11:16.091Z","publish_time":1762474276091,"_source_registry_name":"default"},"3.4.0":{"name":"gpt-tokenizer","version":"3.4.0","description":"A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models","keywords":["BPE","encoder","decoder","tokenizer","GPT","GPT-2","GPT-3","GPT-3.5","GPT-4","GPT-4o","NLP","Natural Language 
Processing","Text Generation","OpenAI","Machine Learning","ml"],"homepage":"https://github.com/niieani/gpt-tokenizer#readme","bugs":{"url":"https://github.com/niieani/gpt-tokenizer/issues"},"repository":{"type":"git","url":"git+https://github.com/niieani/gpt-tokenizer.git"},"license":"MIT","author":{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"},"contributors":[{"name":"Bazyli Brzoska","email":"npm@invent.life","url":"https://github.com/niieani"}],"exports":{".":{"import":"./esm/main.js","require":"./cjs/main.js"},"./*":{"import":"./esm/*.js","require":"./cjs/*.js"},"./cjs":{"require":"./cjs/main.js"},"./cjs/*":{"require":"./cjs/*.js"},"./esm/*":{"import":"./esm/*.js"},"./data/*":{"import":"./data/*","require":"./data/*"},"./package.json":"./package.json"},"main":"esm/main.js","unpkg":"dist/cl100k_base.js","module":"esm/main.js","source":"src/main.ts","scripts":{"codegen":"yarn codegen:bpe && yarn codegen:chat-enabled && yarn codegen:models","codegen:models":"rm -rf src/model && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateByModel.ts","codegen:bpe":"rm -rf src/bpeRanks && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateJsBpe.ts","codegen:chat-enabled":"rm -rf src/chat && node --experimental-transform-types --import node-resolve-ts/register src/codegen/generateChatEnabled.ts","build":"yarn build:cjs && yarn build:esm && yarn build:umd","build:cjs":"yarn rrun tsc --outDir cjs --module commonjs --target es2022 --project tsconfig-cjs.json","build:esm":"mkdir -p esm && echo '{\"name\": \"gpt-tokenizer\", \"type\": \"module\"}' > ./esm/package.json && yarn rrun tsc --outDir esm --target es2022","build:umd":"yarn build:umd:cl100k_base && yarn build:umd:p50k_base && yarn build:umd:p50k_edit && yarn build:umd:r50k_base && yarn build:umd:o200k_base && yarn build:umd:o200k_harmony","build:umd:cl100k_base":"beemo webpack --entry='./src/main.ts' --env 
'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_cl100k_base' --env 'filename=cl100k_base.js'","build:umd:p50k_base":"beemo webpack --entry='./src/encoding/p50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_base' --env 'filename=p50k_base.js'","build:umd:p50k_edit":"beemo webpack --entry='./src/encoding/p50k_edit.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_p50k_edit' --env 'filename=p50k_edit.js'","build:umd:r50k_base":"beemo webpack --entry='./src/encoding/r50k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_r50k_base' --env 'filename=r50k_base.js'","build:umd:o200k_base":"beemo webpack --entry='./src/encoding/o200k_base.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_base' --env 'filename=o200k_base.js'","build:umd:o200k_harmony":"beemo webpack --entry='./src/encoding/o200k_harmony.ts' --env 'outDir=dist' --env 'moduleTarget=umd' --env 'engineTarget=web' --env 'codeTarget=es2022' --env 'name=GPTTokenizer_o200k_harmony' --env 'filename=o200k_harmony.js'","clean":"git clean -dfX --exclude=node_modules src && beemo typescript:sync-project-refs","format":"yarn rrun prettier --write \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\"","postinstallDev":"yarn prepare","prepare":"rrun husky install .config/husky && beemo create-config && echo '\n**/*.gen.ts\nsrc/models.ts' >> .prettierignore","release":"beemo run-script release","test":"yarn test:format && yarn test:types && yarn test:lint && yarn test:code","test:code":"vitest","test:format":"yarn rrun prettier --check \"./{src,tests,.config}/**/!(*.d).{.js,jsx,ts,tsx,json,md}\" --ignore-path 
.prettierignore","test:lint":"rrun eslint 'src/*.{js,jsx,ts,tsx}'","test:types":"yarn rrun tsc --noEmit"},"release":{"branches":["+([0-9])?(.{+([0-9]),x}).x","master",{"name":"main","channel":false},"next",{"name":"beta","prerelease":true},{"name":"alpha","prerelease":true}],"tagFormat":"${version}"},"devDependencies":{"@edge-runtime/vm":"^5.0.0","@niieani/scaffold":"^1.7.54","@swc/cli":"^0.7.8","@swc/core":"1.13.5","devalue":"^5.3.2","node-resolve-ts":"^1.0.2","typescript":"^5.9.3","vitest":"^3.2.4"},"resolutions":{"typescript":"5.8.3","prettier":"^3"},"packageManager":"yarn@4.10.3","publishConfig":{"access":"public"},"types":"./esm/main.d.ts","gitHead":"7f880f46bb34a644ec8f9b3069060b2d2f99e11c","_id":"gpt-tokenizer@3.4.0","_nodeVersion":"22.21.1","_npmVersion":"8.19.4","dist":{"integrity":"sha512-wxFLnhIXTDjYebd9A9pGl3e31ZpSypbpIJSOswbgop5jLte/AsZVDvjlbEuVFlsqZixVKqbcoNmRlFDf6pz/UQ==","shasum":"4caa73edfdaa649f01dfa84f930e8a320ac0b1b1","tarball":"http://123.232.10.234:8212/nexus/content/groups/npm-public/gpt-tokenizer/-/gpt-tokenizer-3.4.0.tgz","fileCount":1348,"unpackedSize":53103516,"signatures":[{"keyid":"SHA256:DhQ8wR5APBvFHLF/+Tc+AYvPOdTpcIDqOhxsBHRwC7U","sig":"MEYCIQD6KNOA0b3dQC0fl2P8XFLiYP0tmz0PZytMOEpc3IVJSQIhAKqwOozFUT9e6UxDWkFbBS23KVIf4X50S9P76usqGCDi"}],"size":17682451},"_npmUser":{"name":"anonymous","email":"npm@invent.life"},"directories":{},"maintainers":[{"name":"anonymous","email":"npm@invent.life"}],"_npmOperationalInternal":{"host":"s3://npm-registry-packages-npm-production","tmp":"tmp/gpt-tokenizer_3.4.0_1762546505793_0.9224176201552261"},"_hasShrinkwrap":false,"_cnpmcore_publish_time":"2025-11-07T20:15:06.227Z","publish_time":1762546506227,"_source_registry_name":"default"}},"dist-tags":{"beta":"2.0.0-beta.2","latest":"3.4.0"},"name":"gpt-tokenizer","time":{"created":"2023-04-16T06:35:27.432Z","modified":"2025-11-07T20:15:24.802Z","1.0.0":"2023-04-16T06:35:22.514Z","1.0.1":"2023-04-16T06:44:19.168Z","1.0.2":"2023-04-18T08:16:31.211Z","1.0.3":"2
023-04-18T08:35:56.926Z","1.0.4":"2023-05-01T05:44:14.060Z","1.0.5":"2023-05-16T07:19:36.459Z","2.0.0-beta.1":"2023-05-23T16:59:09.105Z","2.0.0-beta.2":"2023-05-24T01:40:25.015Z","2.0.0":"2023-05-24T02:03:44.836Z","2.1.0":"2023-06-01T07:42:08.442Z","2.1.1":"2023-06-01T08:05:44.064Z","2.1.2":"2023-10-07T00:57:25.208Z","2.2.0":"2024-07-17T23:04:30.554Z","2.2.1":"2024-07-18T02:01:18.120Z","2.2.2":"2024-09-15T23:11:16.857Z","2.2.3":"2024-09-16T04:27:29.442Z","2.3.0":"2024-09-20T02:37:35.224Z","2.4.0":"2024-09-23T02:47:12.850Z","2.4.1":"2024-10-07T03:27:08.175Z","2.5.0":"2024-10-09T06:09:10.316Z","2.5.1":"2024-10-21T03:27:52.559Z","2.6.0":"2024-11-04T05:56:03.078Z","2.6.1":"2024-11-11T00:38:30.681Z","2.6.2":"2024-11-13T06:26:12.906Z","2.7.0":"2024-11-28T01:50:32.288Z","2.8.0":"2024-12-09T09:43:48.404Z","2.8.1":"2024-12-09T10:11:50.050Z","2.9.0":"2025-03-05T02:40:40.480Z","3.0.0":"2025-06-07T22:51:01.186Z","3.0.1":"2025-06-13T05:04:11.499Z","3.1.0":"2025-10-09T22:47:21.190Z","3.2.0":"2025-10-09T23:12:02.948Z","3.3.0":"2025-11-07T00:11:16.091Z","3.4.0":"2025-11-07T20:15:06.227Z"},"readme":"# gpt-tokenizer\n\n[![NPM version](https://img.shields.io/npm/v/gpt-tokenizer?style=flat-square)](https://www.npmjs.com/package/gpt-tokenizer)\n[![NPM downloads](https://img.shields.io/npm/dm/gpt-tokenizer?style=flat-square)](https://www.npmjs.com/package/gpt-tokenizer)\n[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg?style=flat-square)](https://opensource.org/licenses/MIT)\n[![Build Status](https://img.shields.io/github/actions/workflow/status/niieani/gpt-tokenizer/ci-cd.yml?branch=main&style=flat-square)](https://github.com/niieani/gpt-tokenizer/actions)\n\n`gpt-tokenizer` is a Token Byte Pair Encoder/Decoder supporting all OpenAI's models (including GPT-5, GPT-4o, o1, o3, o4, GPT-4.1 and older models like GPT-3.5, GPT-4).\nIt's the [_fastest, smallest and lowest footprint_](#benchmarks) GPT tokenizer available for all JavaScript environments and is written in 
TypeScript.\n\n> Try it out in the **[playground](https://gpt-tokenizer.dev/)**!\n\nThis library has been trusted by:\n\n- [CodeRabbit](https://www.coderabbit.ai/) (sponsor 🩷)\n- Microsoft ([Teams](https://github.com/microsoft/teams-ai), [GenAIScript](https://github.com/microsoft/genaiscript/))\n- Elastic ([Kibana](https://github.com/elastic/kibana))\n- [Effect TS](https://effect.website/)\n- [Rivet](https://github.com/Ironclad/rivet) by Ironclad\n\nPlease consider [🩷 sponsoring](https://github.com/sponsors/niieani) the project if you find it useful.\n\n#### Features\n\nIt is the most feature-complete, open-source GPT tokenizer on NPM. This package is a port of OpenAI's [tiktoken](https://github.com/openai/tiktoken), with some additional, unique features sprinkled on top:\n\n- Support for easily tokenizing chats thanks to the `encodeChat` function\n- Support for all current OpenAI models (available encodings: `r50k_base`, `p50k_base`, `p50k_edit`, `cl100k_base`, `o200k_base`, and `o200k_harmony`)\n- Can be loaded and work synchronously! (i.e. 
in non async/await contexts)\n- Generator function versions of both the decoder and encoder functions\n- Provides the ability to decode an asynchronous stream of data (using `decodeAsyncGenerator` and `decodeGenerator` with any iterable input)\n- No global cache (no accidental memory leaks, as with the original GPT-3-Encoder implementation)\n- Includes a highly performant `isWithinTokenLimit` function to assess token limit without encoding the entire text/chat\n- Built-in cost estimation with the `estimateCost` function for calculating API usage costs\n- Full library of OpenAI models with comprehensive pricing information (see [`src/models.ts`](./src/models.ts) and [`src/models.gen.ts`](./src/models.gen.ts))\n- Improves overall performance by eliminating transitive arrays\n- Type-safe (written in TypeScript)\n- Works in the browser out-of-the-box\n\n## Installation\n\n### As NPM package\n\n```bash\nnpm install gpt-tokenizer\n```\n\n### As a UMD module\n\n```html\n<script src=\"https://unpkg.com/gpt-tokenizer\"></script>\n\n<script>\n  // the package is now available as a global:\n  const { encode, decode } = GPTTokenizer_cl100k_base\n</script>\n```\n\nIf you wish to use a custom encoding, fetch the relevant script.\n\n- https://unpkg.com/gpt-tokenizer/dist/o200k_base.js (for all modern models, such as `gpt-5`, `gpt-4o`, `gpt-4.1`, `o1` and others)\n- https://unpkg.com/gpt-tokenizer/dist/o200k_harmony.js (for open-weight Harmony models such as `gpt-oss-20b` and `gpt-oss-120b`)\n- https://unpkg.com/gpt-tokenizer/dist/cl100k_base.js (for `gpt-4` and `gpt-3.5`)\n- https://unpkg.com/gpt-tokenizer/dist/p50k_base.js\n- https://unpkg.com/gpt-tokenizer/dist/p50k_edit.js\n- https://unpkg.com/gpt-tokenizer/dist/r50k_base.js\n\nThe global name is a concatenation: `GPTTokenizer_${encoding}`.\n\nRefer to [supported models and their encodings](#Supported-models-and-their-encodings) section for more information.\n\n## Playground\n\nThe playground is published under a memorable 
URL: https://gpt-tokenizer.dev/\n\n[![GPT Tokenizer Playground](./docs/gpt-tokenizer.png)](https://gpt-tokenizer.dev/)\n\n## Usage\n\nThe library provides various functions to transform text into (and from) a sequence of integers (tokens) that can be fed into an LLM model. The transformation is done using a Byte Pair Encoding (BPE) algorithm used by OpenAI.\n\n```typescript\nimport {\n  encode,\n  encodeChat,\n  decode,\n  isWithinTokenLimit,\n  encodeGenerator,\n  decodeGenerator,\n  decodeAsyncGenerator,\n  ALL_SPECIAL_TOKENS,\n} from 'gpt-tokenizer'\n// note: depending on the model, import from the respective file, e.g.:\n// import {...} from 'gpt-tokenizer/model/gpt-4o'\n\nconst text = 'Hello, world!'\nconst tokenLimit = 10\n\n// Encode text into tokens\nconst tokens = encode(text)\n\n// Decode tokens back into text\nconst decodedText = decode(tokens)\n\n// Check if text is within the token limit\n// returns false if the limit is exceeded, otherwise returns the actual number of tokens (truthy value)\nconst withinTokenLimit = isWithinTokenLimit(text, tokenLimit)\n\n// Allow special tokens when needed\nconst withinTokenLimitWithSpecial = isWithinTokenLimit(text, tokenLimit, {\n  allowedSpecial: ALL_SPECIAL_TOKENS,\n})\n\n// Example chat:\nconst chat = [\n  { role: 'system', content: 'You are a helpful assistant.' },\n  { role: 'assistant', content: 'gpt-tokenizer is awesome.' 
},\n] as const\n\n// Encode chat into tokens\nconst chatTokens = encodeChat(chat)\n\n// Check if chat is within the token limit\nconst chatWithinTokenLimit = isWithinTokenLimit(chat, tokenLimit)\n\nconst chatWithinTokenLimitWithSpecial = isWithinTokenLimit(chat, tokenLimit, {\n  allowedSpecial: ALL_SPECIAL_TOKENS,\n})\n\n// Encode text using generator\nfor (const tokenChunk of encodeGenerator(text)) {\n  console.log(tokenChunk)\n}\n\n// Decode tokens using generator\nfor (const textChunk of decodeGenerator(tokens)) {\n  console.log(textChunk)\n}\n\n// Decode tokens using async generator\n// (assuming `asyncTokens` is an AsyncIterableIterator<number>)\nfor await (const textChunk of decodeAsyncGenerator(asyncTokens)) {\n  console.log(textChunk)\n}\n```\n\nBy default, importing from `gpt-tokenizer` uses `o200k_base` encoding, used by all modern OpenAI models, including `gpt-4o`, `gpt-4.1`, `o1`, etc.\n\nTo get a tokenizer for a different model, import it directly, for example:\n\n```ts\nimport {\n  encode,\n  decode,\n  isWithinTokenLimit,\n  // etc...\n} from 'gpt-tokenizer/model/gpt-3.5-turbo'\n```\n\nIf you're dealing with a resolver that doesn't support package.json `exports` resolution, you might need to import from the respective `cjs` or `esm` directory, e.g.:\n\n```ts\nimport {\n  encode,\n  decode,\n  isWithinTokenLimit,\n  // etc...\n} from 'gpt-tokenizer/cjs/model/gpt-3.5-turbo'\n```\n\n#### Lazy loading\n\nIf you don't mind loading the tokenizer asynchronously, you can use a dynamic import inside your function, like so:\n\n```ts\nconst {\n  encode,\n  decode,\n  isWithinTokenLimit,\n  // etc...\n} = await import('gpt-tokenizer/model/gpt-3.5-turbo')\n```\n\n#### Loading an encoding\n\nIf your model isn't supported by the package, but you know which BPE encoding it uses, you can load the encoding directly, e.g.:\n\n```ts\nimport {\n  encode,\n  decode,\n  isWithinTokenLimit,\n  // etc...\n} from 'gpt-tokenizer/encoding/cl100k_base'\n```\n\n### Supported 
models and their encodings\n\nWe support all OpenAI models, including the latest ones, with the following encodings:\n\n- `o`-series models, like `o1-*`, `o3-*` and `o4-*` (`o200k_base`)\n- `gpt-4o` (`o200k_base`)\n- `gpt-oss-*` (`o200k_harmony`)\n- `gpt-4-*` (`cl100k_base`)\n- `gpt-3.5-*` (`cl100k_base`)\n- `text-davinci-003` (`p50k_base`)\n- `text-davinci-002` (`p50k_base`)\n- `text-davinci-001` (`r50k_base`)\n- ...and many other models, see [models.ts](./src/models.ts) for an up-to-date list of supported models and their encodings.\n\nIf you don't see the model you're looking for, the default encoding is probably the one you want.\n\n## API\n\n### `encode(text: string, encodeOptions?: EncodeOptions): number[]`\n\nEncodes the given text into a sequence of tokens. Use this method when you need to transform a piece of text into the token format that the GPT models can process.\n\nThe optional `encodeOptions` parameter allows you to specify special token handling (see [special tokens](#special-tokens)).\n\nExample:\n\n```typescript\nimport { encode } from 'gpt-tokenizer'\n\nconst text = 'Hello, world!'\nconst tokens = encode(text)\n```\n\n### `decode(tokens: number[]): string`\n\nDecodes a sequence of tokens back into text. Use this method when you want to convert the output tokens from GPT models back into human-readable text.\n\nExample:\n\n```typescript\nimport { decode } from 'gpt-tokenizer'\n\nconst tokens = [18435, 198, 23132, 328]\nconst text = decode(tokens)\n```\n\n### `isWithinTokenLimit(text: string | Iterable<ChatMessage>, tokenLimit: number, encodeOptions?: EncodeOptions): false | number`\n\nChecks if the input is within the token limit. Returns `false` if the limit is exceeded, otherwise returns the number of tokens. Use this method to quickly check if a given text or chat is within the token limit imposed by GPT models, without encoding the entire input. 
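The early-exit idea behind `isWithinTokenLimit` can be sketched in isolation. The following is a simplified illustration only, not the library's actual code; `toyChunkedEncoder` is a made-up stand-in that yields one dummy "token" per word, where a real chunked encoder would yield BPE token ids:

```typescript
// Toy stand-in for a chunked encoder: yields one dummy "token" per word.
// A real tokenizer would yield arrays of BPE token ids instead.
function* toyChunkedEncoder(text: string): Generator<number[]> {
  for (const word of text.split(/\s+/)) {
    yield [word.length]
  }
}

// Count tokens chunk by chunk and bail out as soon as the limit is
// exceeded, so the remainder of a long input is never encoded at all.
function withinLimit(text: string, limit: number): false | number {
  let count = 0
  for (const chunk of toyChunkedEncoder(text)) {
    count += chunk.length
    if (count > limit) return false
  }
  return count
}

withinLimit('hello world', 10) // → 2
withinLimit('one two three', 2) // → false
```

The real implementation applies the same pattern on top of `encodeGenerator`, which is why it can be much cheaper than `encode` for long inputs with small limits.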
The optional `encodeOptions` parameter lets you configure special token handling.\n\nExample:\n\n```typescript\nimport { isWithinTokenLimit, ALL_SPECIAL_TOKENS } from 'gpt-tokenizer'\n\nconst text = 'Hello, world!'\nconst tokenLimit = 10\nconst withinTokenLimit = isWithinTokenLimit(text, tokenLimit)\n\nconst withinTokenLimitWithSpecial = isWithinTokenLimit(text, tokenLimit, {\n  allowedSpecial: ALL_SPECIAL_TOKENS,\n})\n```\n\n### `countTokens(text: string | Iterable<ChatMessage>, encodeOptions?: EncodeOptions): number`\n\nCounts the number of tokens in the input text or chat. Use this method when you need to determine the number of tokens without checking against a limit.\nThe optional `encodeOptions` parameter allows you to specify custom sets of allowed or disallowed special tokens.\n\nExample:\n\n```typescript\nimport { countTokens } from 'gpt-tokenizer'\n\nconst text = 'Hello, world!'\nconst tokenCount = countTokens(text)\n```\n\n### `countChatCompletionTokens(request: ChatCompletionRequest): number`\n\nCounts the tokens that a function-calling chat completion request will consume, including message overhead, optional function definitions, and pinned function calls. This helper is only available on models that support the `function_calling` feature.\n\nExample:\n\n```typescript\nimport {\n  countChatCompletionTokens,\n  type ChatCompletionRequest,\n} from 'gpt-tokenizer/model/gpt-4o'\n\nconst request: ChatCompletionRequest = {\n  messages: [\n    { role: 'system', content: 'You are a helpful assistant.' },\n    { role: 'user', content: 'Find the weather for San Francisco.' 
},\n  ],\n  functions: [\n    {\n      name: 'get_weather',\n      description: 'Look up the weather for a city.',\n      parameters: {\n        type: 'object',\n        required: ['city'],\n        properties: {\n          city: { type: 'string' },\n          unit: { type: 'string', enum: ['celsius', 'fahrenheit'] },\n        },\n      },\n    },\n  ],\n}\n\nconst promptTokenEstimate = countChatCompletionTokens(request)\n```\n\nYou can also access the helper from the module's default export:\n\n```typescript\nimport gpt4o from 'gpt-tokenizer/model/gpt-4o'\n\n// Reuse the `request` defined above\nconst tokenCount = gpt4o.countChatCompletionTokens?.(request)\n```\n\n### `encodeChat(chat: ChatMessage[], model?: ModelName, encodeOptions?: EncodeOptions): number[]`\n\nEncodes the given chat into a sequence of tokens. The optional `encodeOptions` parameter lets you configure special token handling.\n\nIf you didn't import the model version directly, or if `model` wasn't provided during initialization, it must be provided here to correctly tokenize the chat for a given model. Use this method when you need to transform a chat into the token format that the GPT models can process.\n\nExample:\n\n```typescript\nimport { encodeChat } from 'gpt-tokenizer'\n\nconst chat = [\n  { role: 'system', content: 'You are a helpful assistant.' },\n  { role: 'assistant', content: 'gpt-tokenizer is awesome.' 
},\n]\nconst tokens = encodeChat(chat)\n```\n\nNote that if you encode an empty chat, it will still contain the minimum number of special tokens.\n\n### `encodeGenerator(text: string): Generator<number[], void, undefined>`\n\nEncodes the given text using a generator, yielding chunks of tokens.\nUse this method when you want to encode text in chunks, which can be useful for processing large texts or streaming data.\n\nExample:\n\n```typescript\nimport { encodeGenerator } from 'gpt-tokenizer'\n\nconst text = 'Hello, world!'\nconst tokens = []\nfor (const tokenChunk of encodeGenerator(text)) {\n  tokens.push(...tokenChunk)\n}\n```\n\n### `encodeChatGenerator(chat: Iterator<ChatMessage>, model?: ModelName): Generator<number[], void, undefined>`\n\nSame as `encodeChat`, but uses a generator as output, and may use any iterator as the input `chat`.\n\n### `decodeGenerator(tokens: Iterable<number>): Generator<string, void, undefined>`\n\nDecodes a sequence of tokens using a generator, yielding chunks of decoded text.\nUse this method when you want to decode tokens in chunks, which can be useful for processing large outputs or streaming data.\n\nExample:\n\n```typescript\nimport { decodeGenerator } from 'gpt-tokenizer'\n\nconst tokens = [18435, 198, 23132, 328]\nlet decodedText = ''\nfor (const textChunk of decodeGenerator(tokens)) {\n  decodedText += textChunk\n}\n```\n\n### `decodeAsyncGenerator(tokens: AsyncIterable<number>): AsyncGenerator<string, void, undefined>`\n\nDecodes a sequence of tokens asynchronously using a generator, yielding chunks of decoded text. 
Use this method when you want to decode tokens in chunks asynchronously, which can be useful for processing large outputs or streaming data in an asynchronous context.\n\nExample:\n\n```javascript\nimport { decodeAsyncGenerator } from 'gpt-tokenizer'\n\nasync function processTokens(asyncTokensIterator) {\n  let decodedText = ''\n  for await (const textChunk of decodeAsyncGenerator(asyncTokensIterator)) {\n    decodedText += textChunk\n  }\n}\n```\n\n### `estimateCost(tokenCount: number, modelSpec?: ModelSpec): PriceData`\n\nEstimates the cost of processing a given number of tokens using the model's pricing data. This function calculates costs for different API usage types (main API, batch API) and cached tokens when available.\n\nThe function returns a `PriceData` object with the following structure:\n\n- `main`: Main API pricing with `input`, `output`, `cached_input`, and `cached_output` costs\n- `batch`: Batch API pricing with the same cost categories\n\nAll costs are calculated in USD based on the token count provided.\n\nExample:\n\n```typescript\nimport { estimateCost } from 'gpt-tokenizer/model/gpt-4o'\n\nconst tokenCount = 1000\nconst costEstimate = estimateCost(tokenCount)\n\nconsole.log('Main API input cost:', costEstimate.main?.input)\nconsole.log('Main API output cost:', costEstimate.main?.output)\nconsole.log('Batch API input cost:', costEstimate.batch?.input)\n```\n\nNote: The model spec must be available either through the model-specific import or by passing it as the second parameter. 
Cost information may not be available for all models.\n\n## Special tokens\n\nThere are a few special tokens that are used by the GPT models.\nNote that not all models support all of these tokens.\n\nBy default, **all special tokens are disallowed**.\n\nThe `encode`, `encodeGenerator`, `encodeChat`, `encodeChatGenerator`, `countTokens`, and `isWithinTokenLimit` functions accept an `EncodeOptions` parameter to customize special token handling:\n\n### Custom Allowed Sets\n\n`gpt-tokenizer` allows you to specify custom sets of allowed special tokens when encoding text. To do this, pass a\n`Set` containing the allowed special tokens as a parameter to the `encode` function:\n\n```ts\nimport {\n  EndOfPrompt,\n  EndOfText,\n  FimMiddle,\n  FimPrefix,\n  FimSuffix,\n  ImStart,\n  ImEnd,\n  ImSep,\n  encode,\n} from 'gpt-tokenizer'\n\nconst inputText = `Some Text ${EndOfPrompt}`\nconst allowedSpecial = new Set([EndOfPrompt])\nconst encoded = encode(inputText, { allowedSpecial })\nconst expectedEncoded = [8538, 2991, 220, 100276]\n\nexpect(encoded).toEqual(expectedEncoded)\n```\n\nYou may also use a special shorthand for either disallowing or allowing all special tokens, by passing in the string `'all'`, e.g. `{ allowedSpecial: 'all' }`.\n\n### Custom Disallowed Sets\n\nSimilarly, you can specify custom sets of disallowed special tokens when encoding text. 
Pass a `Set`\ncontaining the disallowed special tokens as a parameter to the `encode` function:\n\n```ts\nimport { encode, EndOfText } from 'gpt-tokenizer'\n\nconst inputText = `Some Text ${EndOfText}`\nconst disallowedSpecial = new Set([EndOfText])\n// throws an error:\nconst encoded = encode(inputText, { disallowedSpecial })\n```\n\nIn this example, an error is thrown because the input text contains a disallowed special token.\n\nIf both `allowedSpecial` and `disallowedSpecial` are provided, `disallowedSpecial` takes precedence.\n\n## Performance Optimization\n\n### LRU Merge Cache\n\nThe tokenizer uses an LRU (Least Recently Used) cache to improve encoding performance for similar strings. By default, it stores up to 100,000 merged token pairs. You can adjust this value to optimize for your specific use case:\n\n- Increasing the cache size will make encoding similar strings faster but consume more memory\n- Setting it to 0 will disable caching completely\n- For applications processing many unique strings, a smaller cache might be more efficient\n\nYou can modify the cache size using the `setMergeCacheSize` function:\n\n```ts\nimport { setMergeCacheSize } from 'gpt-tokenizer'\n\n// Set to 5000 entries\nsetMergeCacheSize(5000)\n\n// Disable caching completely\nsetMergeCacheSize(0)\n```\n\nThe cache is persisted between encoding calls. To explicitly clear the cache (e.g. to free up memory), use the `clearMergeCache` function:\n\n```ts\nimport { clearMergeCache } from 'gpt-tokenizer'\n\nclearMergeCache()\n```\n\n## Testing and Validation\n\n`gpt-tokenizer` includes a set of test cases in the [TestPlans.txt](./data/TestPlans.txt) file to ensure its compatibility with OpenAI's Python `tiktoken` library. 
These test cases validate the functionality and behavior of `gpt-tokenizer`, providing a reliable reference for developers.\n\nRunning the unit tests and verifying the test cases helps maintain consistency between the library and the original Python implementation.\n\n### Model Information\n\n`gpt-tokenizer` provides comprehensive data about all OpenAI models through the `models` export from [`gpt-tokenizer/models`](./src/models.ts). This includes detailed information about context windows, costs, training data cutoffs, and deprecation status.\n\nThe data is regularly maintained to match OpenAI's official documentation. Contributions to keep this data up-to-date are welcome - if you notice any discrepancies or have updates, please feel free to open a PR.\n\n## [Benchmarks](https://l8j6fv.csb.app/)\n\nSince version 2.4.0, `gpt-tokenizer` is the fastest tokenizer implementation available on NPM. It's even faster than the available WASM/node binding implementations.\nIt has the fastest encoding and decoding times and a tiny memory footprint. It also initializes faster than all other implementations.\n\nThe encodings themselves are also the smallest in size, due to the compact format they are stored in.\n\n![fastest benchmark](./docs/fastest.png)\n\n![lowest footprint benchmark](./docs/lowest-footprint.png)\n\n## License\n\nMIT\n\n## Contributing\n\nContributions are welcome! Please open a pull request or an issue for bug reports, or use the discussions feature for ideas or any other inquiries.\n\n## Thanks\n\nThanks to @dmitry-brazhenko's [SharpToken](https://github.com/dmitry-brazhenko/SharpToken), whose code served as a reference for the port.\n\nHope you find `gpt-tokenizer` useful in your projects!","users":{}}