Tokei Versions

Count your code, quickly.

v12.0.0

3 years ago

Introduction

Tokei is a fast and accurate code analysis CLI tool and library, allowing you to easily and quickly see how many blank lines, comments, and lines of code are in your codebase. All releases and work on Tokei and tokei.rs (the free companion badge service) are funded by the community through GitHub Sponsors.

You can always download the latest version of tokei through GitHub Releases or Cargo. Tokei is also available through other package managers, though they may not always contain the latest release.

cargo install tokei
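
Tokei can also be used as a library from Rust. The sketch below, adapted from the crate's documented usage, counts a couple of directories and prints the Rust totals; the paths and excluded patterns are only examples.

use tokei::{Config, LanguageType, Languages};

fn main() {
    // The paths to scan; accepts relative, absolute, and glob paths.
    let paths = &["src", "tests"];
    // Any path containing one of these strings is skipped.
    let excluded = &["target"];
    // Default configuration, mirroring the CLI defaults.
    let config = Config::default();

    let mut languages = Languages::new();
    languages.get_statistics(paths, excluded, &config);

    let rust = &languages[&LanguageType::Rust];
    println!(
        "Rust: {} code, {} comments, {} blanks",
        rust.code, rust.comments, rust.blanks
    );
}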

What's New?

Tokei 12 comes with some of the biggest user-facing changes since 1.0. Tokei will now analyse and count multiple languages embedded in your source code, and adds support for Jupyter Notebooks. For the first time, Tokei is able to handle and display different languages contained in a single source file. This is currently available for a limited set of languages, with plans to add support for more in the future. The currently supported languages are:

HTML + Siblings (Vue, Svelte, Etc...)

Tokei will now analyse and report the source code contained in <script>, <style>, and <template> tags in HTML and other similar languages. Tokei reads the value of the type attribute from the <script> tag and detects the appropriate language based on its MIME type, defaulting to JavaScript if the attribute is not present. Tokei does the same for <style> and <template>, except it reads the lang attribute instead of type, defaulting to CSS and HTML respectively.

Jupyter Notebooks

Tokei will now read Jupyter Notebook files (.ipynb), extracting the source code and markdown from Jupyter's JSON and outputting the analysed result.

Markdown

Tokei will now detect code blocks marked with a specified source language and count each as its respective language, or as Markdown if no language is specified or recognised. Now you can easily see how many code examples are included in your documentation.

Rust

Tokei will now detect blocks of rustdoc documentation (e.g. /// or //!) and parse them as Markdown.
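
As an illustration, in a hypothetical file like the one below, the //! and /// comments are parsed as embedded Markdown, and a fenced example inside them can in turn be counted as Rust.

//! Utilities for greeting people.
//!
//! The fenced block below is counted as embedded Rust,
//! while the surrounding prose is counted as Markdown.
//!
//! ```
//! let greeting = String::from("Hello, Tokei!");
//! assert!(greeting.contains("Tokei"));
//! ```

/// Returns a friendly greeting for `name`.
pub fn greet(name: &str) -> String {
    format!("Hello, {}!", name)
}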

Verbatim Strings

Tokei is now also capable of handling "verbatim" strings, which are strings that do not process escape sequences such as \. Thanks to @NickHackman for providing the implementation! This is initially supported for C++, C#, F#, and Rust.
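
For example, in Rust a raw string is taken literally, so comment markers or backslashes inside it must not confuse the counter. A minimal illustration:

fn main() {
    // A normal string: the backslash starts an escape sequence.
    let escaped = "C:\\Users\\tokei";

    // A raw ("verbatim") string: its contents are taken literally,
    // so the backslashes and the `//` inside it must not be treated
    // as an escape sequence or the start of a comment.
    let raw = r"C:\Users\tokei // not a comment";

    println!("{} {}", escaped, raw);
}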

New Look

To show these new features, tokei's output has been changed to look like the example below. For brevity the CLI only displays one level deep in each language; however, the library's parser is fully recursive, and you can get access to the complete report using the library or by outputting the JSON format.

===============================================================================
 Language            Files        Lines         Code     Comments       Blanks
===============================================================================
 BASH                    4           49           30           10            9
 JSON                    1         1332         1332            0            0
 Shell                   1           49           38            1           10
 TOML                    2           77           64            4            9
-------------------------------------------------------------------------------
 Markdown                5         1230            0          965          265
 |- JSON                 1           41           41            0            0
 |- Rust                 2           53           42            6            5
 |- Shell                1           22           18            0            4
 (Total)                           1346          101          971          274
-------------------------------------------------------------------------------
 Rust                   19         3349         2782          116          451
 |- Markdown            12          351            5          295           51
 (Total)                           3700         2787          411          502
===============================================================================
 Total                  32         6553         4352         1397          804
===============================================================================
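
Because the CLI output above only goes one level deep, here is a hedged sketch of how the full nested report might be read through the library; the children field and the Report type's stats field are assumptions based on this release's library API and may differ between versions.

use tokei::{Config, LanguageType, Languages};

fn main() {
    let config = Config::default();
    let mut languages = Languages::new();
    languages.get_statistics(&["."], &["target"], &config);

    if let Some(markdown) = languages.get(&LanguageType::Markdown) {
        // Assumption: embedded languages are exposed as a map of
        // child LanguageType -> Vec<Report> on the parent Language.
        for (child, reports) in &markdown.children {
            let code: usize = reports.iter().map(|report| report.stats.code).sum();
            println!("Markdown |- {}: {} lines of code", child, code);
        }
    }
}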

This feature is not limited to tokei's default output. You can see the breakdown for each individual file with the --files option.

===============================================================================
 Language            Files        Lines         Code     Comments       Blanks
===============================================================================
 Markdown                5         1230            0          965          265
 |- JSON                 1           41           41            0            0
 |- Rust                 2           53           42            6            5
 |- Shell                1           22           18            0            4
 (Total)                           1346          101          971          274
-------------------------------------------------------------------------------
 ./CODE_OF_CONDUCT.md                46            0           28           18
 ./CHANGELOG.md                     570            0          434          136
-- ./markdown.md --------------------------------------------------------------
 |- Markdown                          4            0            3            1
 |- Rust                              6            4            1            1
 |- (Total)                          10            4            4            2
-- ./README.md ----------------------------------------------------------------
 |- Markdown                        498            0          421           77
 |- Shell                            22           18            0            4
 |- (Total)                         520           18          421           81
-- ./CONTRIBUTING.md ----------------------------------------------------------
 |- Markdown                        112            0           79           33
 |- JSON                             41           41            0            0
 |- Rust                             46           38            4            4
 |- (Total)                         200           79           84           37
===============================================================================
 Total                   5         1346          101          971          274
===============================================================================

Breaking Changes

  • The JSON output and the format of Languages have changed.
  • The JSON feature has been removed and is now included by default.
  • Stats has been split into Report and CodeStats to better represent the separation between analysing a file versus a blob of code; see the sketch below.
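
A hedged sketch of how this split looks from the library side; the field names below are taken from the tokei 12 documentation as I understand it and should be treated as illustrative rather than definitive.

use tokei::{Config, LanguageType, Languages};

fn main() {
    let config = Config::default();
    let mut languages = Languages::new();
    languages.get_statistics(&["src"], &[], &config);

    if let Some(rust) = languages.get(&LanguageType::Rust) {
        // Each Report describes a single analysed file...
        for report in &rust.reports {
            // ...while its CodeStats holds the line counts for that file.
            println!(
                "{}: {} code / {} comments / {} blanks",
                report.name.display(),
                report.stats.code,
                report.stats.comments,
                report.stats.blanks
            );
        }
    }
}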

v11.2.1

3 years ago

v11.2.0

3 years ago
  • @alexmaco Added shebang and env detection for Crystal.
  • @NickHackman Updated both Vue and HTML to count CSS & JS comments as comments.
  • @XAMPPRocky renamed Perl6's display name to Rakudo.
  • @dbackeus Added erb extension for Ruby HTML.
  • @kobataiwan Tokei will now check for a configuration file in your home directory, as well as in your current directory and your configuration directory.
  • @dependabot Updated dependencies.

Added Languages

  • @alexmaco Dhall
  • @NickHackman Svelte
  • @athas Futhark
  • @morphy2k Gohtml
  • @LucasMW Headache
  • @rosasynstylae Tsx
  • @XAMPPRocky OpenType Feature Files

v11.1.1

3 years ago

v11.1.0

4 years ago

Added Languages

  • @rubdos Arduino
  • @LuqueDaniel Pan
  • @itkovian Ren'Py

Added LanguageType::shebangs, LanguageType::from_file_extension, and LanguageType::from_shebang. (@solanav)
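
A small hedged example of the new helpers; the exact signatures may differ slightly, and the assumption here is that from_file_extension takes the extension without its leading dot.

use tokei::LanguageType;

fn main() {
    // Map a bare file extension to a language, if tokei recognises it.
    assert_eq!(
        LanguageType::from_file_extension("rs"),
        Some(LanguageType::Rust)
    );

    // List the shebang interpreters associated with a language.
    println!("{:?}", LanguageType::Python.shebangs());
}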

v11.0.0

4 years ago

Added Languages

  • @bwidawsk GNU Assembly, GDB Script
  • @isker Dust, Apache Velocity
  • @andreblanke FreeMarker

Thanks to some major internal refactoring, Tokei has received significant performance improvements and is now one of the fastest code counters across any size of codebase, with Tokei 11 showing up to 40–60% faster results than the previous version. To showcase the improvements, I've highlighted benchmarks of counting four differently sized codebases: Tokei itself (~5k lines), Redis (~220k lines), Rust (~16M lines), and the Unreal Engine (~37.5M lines). In every one of these benchmarks Tokei 11 performed the best by a noticeable margin.

All benchmarks were done on a 15-inch MacBook Pro with a 2.7GHz Intel Core i7 processor and 16GB of 2133MHz LPDDR3 RAM, running macOS Catalina 10.15.3. Your mileage may vary. All benchmarks were run using hyperfine, with default settings for all programs.

Tokei (~5k lines)

Note: This benchmark is not accurate, because tokei and loc both take less than 5ms to complete; there is a high degree of error between these times, and they should mostly be considered equivalent. It is included, however, because it is notable that scc takes nearly 3x as long to complete on smaller codebases.

Graph comparing programs running on the tokei source code

Redis (~220k lines)

Graph comparing programs running on the redis source code

Rust (~16M lines)

Graph comparing programs running on the rust source code

Unreal (~37.5M lines)

Graph comparing programs running on the unreal source code

v10.1.2

4 years ago

v10.1.1

4 years ago

v10.1.0

4 years ago

v10.0.1

4 years ago