I like Netpeak Spider because it is a stable tool. The very idea of crawling pages is impressive in itself: it lets both novices and seasoned professionals carry out a comprehensive analysis and draw relevant conclusions from it.
It is useful primarily to experts. It's like reading a cardiogram in a hospital: it surfaces parameters such as canonical tags, redirects, and X-Robots-Tag, which many other tools cannot handle or do not even think about.
What I really need now, and what limits me to some extent, is a version for macOS.
This tool is just a bomb. Why? Because it crawls a site and reports on almost every technical issue that matters in modern SEO.
My favorite features in Netpeak Spider are detecting duplicate pages and, along with that, calculating internal PageRank. Among the features I use when analyzing websites at the sales stage, or during service delivery and the SEO process itself, are checking correct canonical setup, validating the robots.txt file, X-Robots-Tag, noindex checks, and so on.
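To give a sense of what checks like these involve under the hood, here is a minimal sketch of a noindex check, the kind of per-page inspection a crawler automates. The helper name and logic are my own illustration, not Netpeak Spider's actual implementation: it looks for a "noindex" directive in the X-Robots-Tag HTTP header and in the robots meta tag.

```python
import re

def is_noindex(headers: dict, html: str) -> bool:
    """Return True if the page appears blocked from indexing.

    Hypothetical helper for illustration only; a real crawler also
    handles robots.txt, canonical tags, redirects, etc.
    """
    # HTTP header check: X-Robots-Tag may carry comma-separated directives,
    # e.g. "noindex, nofollow".
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    # Markup check: <meta name="robots" content="noindex, follow">
    for tag in re.finditer(r'<meta\s+[^>]*name=["\']robots["\'][^>]*>', html, re.I):
        content = re.search(r'content=["\']([^"\']*)["\']', tag.group(0), re.I)
        if content and "noindex" in content.group(1).lower():
            return True
    return False
```

Running this kind of check across every crawled URL is exactly the tedious work such tools take off your hands.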
Netpeak Spider wins with its speed, on the one hand. On the other hand, version 3.2 offers far more advanced functionality compared both to the previous version and to other tools.
I constantly follow the team's work and the development of the project: I subscribe to the newsletter, keep up with the news, and can see that the guys are on the right track and know what they are doing.