tokenize: Simple tokenizer for English text
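Below is a minimal usage sketch; it assumes the NLP.Tokenize.Text module exposed by recent releases, whose tokenize function has type Text -> [Text] (a String-based variant lives in NLP.Tokenize.String).

```haskell
{-# LANGUAGE OverloadedStrings #-}

-- Minimal sketch: assumes NLP.Tokenize.Text exports tokenize :: Text -> [Text].
import qualified Data.Text.IO as TIO
import NLP.Tokenize.Text (tokenize)

main :: IO ()
main = do
  -- Split a sentence into word and punctuation tokens and print one per line.
  let toks = tokenize "Mr. Brown isn't going to New York, is he?"
  mapM_ TIO.putStrLn toks
```

The text dependency listed under Dependencies below corresponds to this Text-based interface.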
Downloads
- tokenize-0.3.0.1.tar.gz (Cabal source package)
- Package description (as included in the package)
| | |
| --- | --- |
| Versions | 0.1.0, 0.1.1, 0.1.2, 0.1.3, 0.2.0, 0.2.2, 0.3.0, 0.3.0.1 |
| Change log | CHANGELOG.md |
| Dependencies | base (>=4 && <5), split (>=0.1), text |
| Tested with | GHC 9.10.0, 9.8.2, 9.6.4, 9.4.8, 9.2.8, 9.0.2, 8.10.7, 8.8.4, 8.6.5, 8.4.4, 8.2.2, 8.0.2, 7.10.3 |
| License | BSD-3-Clause |
| Author | Grzegorz Chrupała |
| Maintainer | Andreas Abel |
| Category | Natural Language Processing |
| Home page | https://github.com/haskell/tokenize |
| Bug tracker | https://github.com/haskell/tokenize/issues |
| Source repo | head: git clone https://github.com/haskell/tokenize |
| Uploaded | by AndreasAbel at 2024-04-08T19:44:19Z |
| Distributions | LTSHaskell:0.3.0.1, NixOS:0.3.0.1, Stackage:0.3.0.1 |
| Reverse dependencies | 4 direct, 0 indirect |
| Downloads | 8156 total (23 in the last 30 days) |
| Status | Docs available; last build success reported on 2024-04-08 |