Originally published on my blog.
I’ve been using pylint for almost a decade now.
Fast-forward ten years, and I’ve decided to no longer use it.
Here’s why.
Introduction
Let’s start with an example. Consider the following, obviously incorrect code:
def foo():
    ...
if __name__ == "__main__":
    foo(1, 2, 3)
Here’s what the output of pylint might look like when you run it:
$ pylint foo.py
foo.py:4: [E1121(too-many-function-args), ] Too many positional arguments for function call
Now let’s see a few problems I’ve encountered while using pylint.
Pain points
Initial setup
Initial setup of pylint is always a bit painful. However, if you follow some advice you can get through it.
False positives
A recurring issue with pylint is the number of false positives: that is, cases where pylint thinks something is wrong but the code is perfectly OK.
For instance, I like using the attrs library whenever I have a class that mostly contains data, like so:
import attr
@attr.s
class Foo:
    bar = attr.ib()
    baz = attr.ib()
Those few lines of code give me a nice human-readable __repr__, a complete set of comparison methods, and sensible constructors (among other things), all without any boilerplate.
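For instance (a quick demo of my own, not from the original post), here’s what the generated methods give you:

import attr


@attr.s
class Foo:
    bar = attr.ib()
    baz = attr.ib()


# attrs generates __init__, __repr__, and the comparison methods:
foo = Foo(bar=1, baz=2)
print(foo)               # Foo(bar=1, baz=2)
print(foo == Foo(1, 2))  # True, thanks to the generated __eq__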
But when I run pylint on the original file, I get:
foo.py:3: [R0903(too-few-public-methods), Foo] Too few public methods (0/2)
Well, it’s perfectly reasonable to require at least two public methods for every class you declare: most of the time, when you have a class with just one public method, it’s better to just use a function instead, like this:
# What you wrote:
class Greeter:
    def __init__(self, name="world"):
        self._name = name

    def greet(self):
        print("Hello", self._name)


# What you should have written instead:
def greet(name="world"):
    print("Hello", name)
But here pylint does not know about all the nice methods added “dynamically” by attrs and wrongly assumes our design is wrong.
Thus, if you run pylint during CI and fail the build when any error is found, you have to insert a specially formatted comment to disable this warning locally:
import attr
# pylint: disable=too-few-public-methods
@attr.s
class Foo:
    ...
This gets old fast, especially because every time you upgrade pylint, a new batch of checks gets added. Sometimes they catch new problems in your code, but you still have to go through each and every new error to check whether it’s a false positive or a real issue.
But so far I had managed to overcome those pain points. So what changed?
Turning the page
Two things happened:
First, I’ve started using mypy and a “real” type system 1.
What I found is that mypy can catch many of the errors pylint would catch, and probably more.
Also, since it uses type annotations, mypy is both faster and more precise than pylint (because it does not have to “guess” anything).
Last but not least, mypy was also designed to be used gradually, emitting errors only when it is sure there’s something wrong.
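As a quick illustration (my own sketch, not from the original post), annotating the example from the introduction is enough for mypy to flag the bad call:

# example.py -- an annotated version of the snippet from the introduction
def foo() -> None:
    ...


if __name__ == "__main__":
    # Running `mypy example.py` reports something like:
    #   error: Too many arguments for "foo"
    foo(1, 2, 3)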
Secondly, I decided to port one of my projects to Python 3.7. I had to bump pylint from 1.9 to 2.1 (because older pylint versions do not support Python 3.7), and I got 18 new pylint errors, only one of which was actually relevant.
It was at this moment I decided to take a step back.
Categories
As we saw in those examples, pylint error messages contain a short name for the error (like too-many-function-args) and a numeric ID prefixed by a letter (E1121).
Each letter corresponds to a pylint category.
Here is a complete list:
- (F)atal (something prevented pylint from running normally)
- (E)rror (serious bug)
- (W)arning (not so serious issue)
- (I)nfo (errors like being unable to parse a # pylint: disable comment)
- (C)onvention (coding style)
- (R)efactoring (code that could be written in a clearer or more Pythonic way)
Note that the Fatal and Info categories are only useful when trying to understand why pylint does not behave the way it should.
The rise of the linters
I realized I could use other linters (not just mypy) for almost every pylint category.
- Some of the Error messages can also be caught by pyflakes, which is fast and produces very few false positives, too.
- The Convention category can also be taken care of by pycodestyle.
- A few Refactoring warnings (but not all) can also be caught by mccabe, which measures code complexity.
So far I’ve been using all these linters in addition to pylint, as explained in How I lint my Python.
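For reference, here’s roughly what running them together can look like (a minimal sketch of my own; the script name lint.py is hypothetical, and it assumes pyflakes, pycodestyle, and mccabe are installed):

# lint.py -- run the pylint alternatives on a list of files (hypothetical helper)
import subprocess
import sys


def main() -> int:
    files = sys.argv[1:]
    rc = 0
    # pyflakes and pycodestyle accept several files at once
    rc |= subprocess.call(["pyflakes", *files])
    rc |= subprocess.call(["pycodestyle", *files])
    # mccabe is run as a module, one file at a time, with a complexity threshold
    for path in files:
        rc |= subprocess.call([sys.executable, "-m", "mccabe", "--min", "10", path])
    return rc


if __name__ == "__main__":
    sys.exit(main())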
But what if I stopped using pylint altogether?
All I would lose would be some of the Refactoring messages, but I assumed most of them would get caught during code review. In exchange, I could get rid of all these noisy # pylint: disable comments (34 of them for about 5,000 lines of code).
And that’s how I stopped using pylint and removed it from my CI scripts. My apologies to pylint authors and maintainers: you did a really great job all these years, but I now believe it’s time for me to move on and use new and better tools instead.
What’s next
This is not the end of the story of my never-ending quest for tools to help me write better Python code. You can read the rest of the story in Hello flake8.
Thanks for reading this far :)
I'd love to hear what you have to say, so please feel free to leave a comment below, or read the feedback page for more ways to get in touch with me.
1. By the way, at the end of Giving mypy a go I said I was curious to know if mypy would help during a massive refactoring. Well, it did, even better than I would have hoped!
Top comments (10)
Maybe you can give wemake-python-styleguide a try? It has even more rules than pylint, but does not even try to mess with types. It has way fewer false positives and is based on flake8.
Cheers!
I tried it and it's a bit too strict for my taste - I got tons of errors to comb through.
That being said, I'm a bit tempted to start using it from day one for my next project and see how it goes :)
Awesome! Any feedback is appreciated.
Thank you for sharing your experience! I have to say that I personally don't like python and never really got into the language, but I find it interesting to see how others are working :)
What I don't understand though is: why use a dynamic language to avoid a type system, and then add it again with a linter and maybe unit tests? Wouldn't it be better if the compiler did the work for you in the first place? Even if it might sound provocative, it is meant as an honest question. I mostly work with C# and F# and appreciate the support from the compiler, of course at the cost of being constrained in what you can do. From my experience, the completely dynamic approach is often easy to get started with, but hard to maintain and refactor in the long run, because you don't know where you have to apply a change when something in the data model changes.
Good question.
First off, there are many reasons to use Python: personally, I like the simplicity and power of function definitions. You can do pretty much anything you like: named parameters, defaults, variadic arguments, and you still don't have to care about overloading.
I also like the fact that in Python, there's little difference between the name of something and the actual thing.
And then there's the standard library, the ecosystem, the tooling, the community, the way the language evolves, the fact that it's really easy to learn, the test framework (pytest of course, not unittest) ...
So, back to your original question:
- It's not just a dynamic language, it's a language I love.
- I'm not avoiding it. You can read more about that here.
- Because I can! Those tools are easy to set up and provide many cool features.
Whatever floats your boat :)
Personally, I've been writing Python for so long that it's hard for me to feel as productive in any other language, and the cost of the constraints is just too high for me.
That being said, lately I've been writing Rust for fun and I'm starting to develop a crush on this language too ...
That's a long-standing debate I don't really want to get into right now. Just let me say that, based on my experience, the thing that helps the most in maintaining and refactoring large code bases in the long run is automated tests.
Yes, for sure I don't want to enter a religious debate here right now ;-)
As I said, I'm always interested in understanding the perspective and experience of other developers.
And I fully support the conclusion in your linked post: people tend to be bad at spotting mistakes in their code, and we therefore should combine multiple techniques to reduce the number of errors.
For me, a more and more important part is support from the language and compiler, the often (probably too often) quoted "making illegal states unrepresentable". This and a type system can reduce the number of simple automated tests and give more time to focus on testing the actual logic (again, my experience).
But of course, everyone today is shaped by their own experience, and there are good reasons for each view.
mypy won't catch everything; for example, pylint can catch closure bugs:
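For example, here's the kind of closure bug pylint's cell-var-from-loop check reports while mypy, looking only at the types, stays silent (a sketch, not the commenter's original snippet):

def make_printers(items):
    printers = []
    for item in items:
        # Each lambda closes over the variable `item`, not its value at this
        # iteration, so every printer ends up printing the last item.
        # pylint flags this (cell-var-from-loop); mypy sees no type error.
        printers.append(lambda: print(item))
    return printers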
Good point. I'll talk about this in a following post. Stay tuned!
And here's the new article, as promised: dmerej.info/blog/post/hello-flake8/
I'm getting tired of pylint as well. Sometimes it catches some pretty useful stuff, but there's a lot of noise, as you said.
I'll take a look at mypy, thanks!