1. NumPy (Numerical Python)
- The most powerful feature of NumPy is the n-dimensional array.
- It contains basic linear algebra functions, Fourier transforms, and tools for integrating with low-level languages such as C, C++, and Fortran.
Ref:
https://t.co/XY13ILXwSN
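A minimal sketch of what that looks like in practice (the array values are made up for illustration):

    import numpy as np

    # The n-dimensional array (ndarray) is NumPy's core object.
    a = np.array([[1.0, 2.0],
                  [3.0, 4.0]])

    # Basic linear algebra on the array.
    print(a.T @ a)               # matrix product
    print(np.linalg.inv(a))      # inverse
    print(np.linalg.eigvals(a))  # eigenvalues

    # A discrete Fourier transform of a short signal.
    signal = np.sin(np.linspace(0, 2 * np.pi, 8))
    print(np.fft.fft(signal))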
2. SciPy (Scientific Python)
- SciPy is built on NumPy.
- It provides a variety of high-level science and engineering modules, including the discrete Fourier transform, linear algebra, optimization, and sparse matrices.
Ref:
https://t.co/ALTFqM2VUo
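A short sketch touching the modules mentioned above (the objective function and matrix are made up for illustration):

    import numpy as np
    from scipy import optimize, sparse
    from scipy.fft import fft

    # Optimization: minimize a simple quadratic; the minimum is at x = 3.
    result = optimize.minimize(lambda x: (x[0] - 3) ** 2 + 1, x0=[0.0])
    print(result.x)

    # Discrete Fourier transform.
    print(fft(np.array([1.0, 2.0, 1.0, 0.0])))

    # Sparse matrices: store only the non-zero entries.
    m = sparse.csr_matrix(np.eye(4))
    print(m.nnz)  # 4 stored non-zeros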
3. Matplotlib
- Matplotlib is a comprehensive library for creating static, animated, and interactive visualizations in Python.
- You can also use LaTeX commands to add mathematical expressions to your plots.
- Matplotlib makes easy things easy and hard things possible.
Ref:
https://t.co/zodOo2WzGx
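For example, a small static plot with LaTeX math in the labels (a sketch only; use plt.show() instead of savefig in an interactive session):

    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0, 2 * np.pi, 200)

    fig, ax = plt.subplots()
    ax.plot(x, np.sin(x), label=r"$\sin(x)$")   # LaTeX-style math in the legend
    ax.set_title(r"Plot of $\sin(x)$")          # and in the title
    ax.set_xlabel("x")
    ax.set_ylabel("y")
    ax.legend()
    plt.savefig("sine.png")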
4. Pandas
- Pandas is for structured data operations and manipulations.
- It is extensively used for data munging and preparation.
- Pandas was added to the Python ecosystem relatively recently and has been instrumental in boosting Python's usage in data science.
Ref:
https://t.co/IFzikVHht4
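A small munging sketch with made-up data: filter rows, derive a column, then group and aggregate.

    import pandas as pd

    # A tiny structured dataset (values are made up).
    df = pd.DataFrame({
        "city": ["Oslo", "Oslo", "Bergen", "Bergen"],
        "month": ["Jan", "Feb", "Jan", "Feb"],
        "temp_c": [-4.3, -4.0, 1.5, 1.6],
    })

    below_zero = df[df["temp_c"] < 0]           # filter rows
    df["temp_f"] = df["temp_c"] * 9 / 5 + 32    # derive a new column
    print(df.groupby("city")["temp_c"].mean())  # group and aggregate
    print(below_zero)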
5. Scikit-learn
- Built on NumPy, SciPy, and Matplotlib, this library contains many efficient tools for machine learning and statistical modeling, including classification, regression, clustering, and dimensionality reduction.
Ref:
https://t.co/TCaQXPvKkk
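A minimal classification sketch, using the iris dataset bundled with scikit-learn as a stand-in for real data:

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Load data and hold out a test split.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    # Fit a classifier and evaluate it on the held-out data.
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)
    print(accuracy_score(y_test, model.predict(X_test)))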
6. Statsmodels
- Statsmodels for statistical modeling.
- Statsmodels is a Python module that allows users to explore data, estimate statistical models, and perform statistical tests.
Ref:
https://t.co/5CXswFvpPx
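For example, an ordinary least squares fit on simulated data; summary() prints coefficients, p-values, and confidence intervals:

    import numpy as np
    import statsmodels.api as sm

    # Simulated data: y depends linearly on x plus noise.
    rng = np.random.default_rng(0)
    x = rng.normal(size=100)
    y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=100)

    # OLS with an intercept term (add_constant adds the column of ones).
    X = sm.add_constant(x)
    results = sm.OLS(y, X).fit()
    print(results.summary())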
7. Seaborn
- Seaborn for statistical data visualization.
- Seaborn is a library for making attractive and informative statistical graphics in Python. It is based on matplotlib.
- Seaborn aims to make visualization a central part of exploring and understanding data.
Ref:
https://t.co/cSxJlr09mq
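A one-call statistical plot, using the "tips" example dataset that seaborn can fetch (an assumption; any DataFrame with suitable columns works):

    import seaborn as sns
    import matplotlib.pyplot as plt

    # Example dataset shipped with seaborn (downloaded on first use).
    tips = sns.load_dataset("tips")

    # Scatter plot plus linear fit with a confidence band, split by category.
    sns.lmplot(data=tips, x="total_bill", y="tip", hue="smoker")
    plt.savefig("tips.png")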
8. Blaze
- Blaze for extending the capabilities of NumPy and Pandas to distributed and streaming datasets.
- It can be used to access data from a multitude of sources including Bcolz, MongoDB, SQLAlchemy, Apache Spark, PyTables, etc.
Ref:
https://t.co/5NhpM0reaH
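A rough sketch assuming Blaze's expression API (data() and by(), as in its quickstart); the idea is that the same expressions run over SQL, MongoDB, Spark, and other backends by pointing data() at a different source:

    from blaze import data, by

    # Wrap an in-memory table; swapping in e.g. a SQLAlchemy URI would keep
    # the query code unchanged.
    accounts = data([(1, "Alice", 100),
                     (2, "Bob", -200),
                     (3, "Alice", 300)],
                    fields=["id", "name", "amount"])

    # Lazily described split-apply-combine, evaluated by the backend.
    totals = by(accounts.name, total=accounts.amount.sum())
    print(totals)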
9. Scrapy
- Scrapy for web crawling.
- It is a very useful framework for extracting specific patterns of data from web pages.
- It can start at a website's home URL and then dig through the web pages within that site to gather information.
Ref:
https://t.co/iEYIazAd2B
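A minimal spider in the spirit of the official tutorial; the target site and CSS selectors (quotes.toscrape.com, div.quote, and so on) are illustrative:

    import scrapy

    class QuotesSpider(scrapy.Spider):
        """Starts at a home URL and follows pagination links within the site."""
        name = "quotes"
        start_urls = ["https://quotes.toscrape.com/"]

        def parse(self, response):
            # Pull a specific pattern of data out of each page.
            for quote in response.css("div.quote"):
                yield {
                    "text": quote.css("span.text::text").get(),
                    "author": quote.css("small.author::text").get(),
                }
            # Dig through further pages within the website.
            next_page = response.css("li.next a::attr(href)").get()
            if next_page is not None:
                yield response.follow(next_page, callback=self.parse)

Saved as a standalone file, this can be run with scrapy runspider and an output file, e.g. scrapy runspider quotes_spider.py -o quotes.json.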
10. SymPy
- SymPy for symbolic computation.
- It has wide-ranging capabilities from basic symbolic arithmetic to calculus, algebra, discrete mathematics, and quantum physics.
- It can also format the results of computations as LaTeX code.
Ref:
https://t.co/hesVmRJLVj
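A small sketch of symbolic calculus, equation solving, and LaTeX output:

    import sympy as sp

    x = sp.symbols("x")

    expr = sp.sin(x) * sp.exp(x)
    derivative = sp.diff(expr, x)        # exp(x)*sin(x) + exp(x)*cos(x)
    integral = sp.integrate(expr, x)     # exp(x)*sin(x)/2 - exp(x)*cos(x)/2
    solutions = sp.solve(x**2 - 2, x)    # [-sqrt(2), sqrt(2)]

    # Format a result as LaTeX code.
    print(sp.latex(derivative))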
Additional libraries you might need:
- os for operating system and file operations.
- NetworkX for graph-based data manipulation.
- re (regular expressions) for finding patterns in text data.
- BeautifulSoup for scraping the web.
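A combined sketch of the four (the file pattern, sample text, and HTML snippet are made up for illustration):

    import os
    import re
    import networkx as nx
    from bs4 import BeautifulSoup

    # os: operating system and file operations.
    html_files = [f for f in os.listdir(".") if f.endswith(".html")]

    # re: find patterns in text data.
    text = "Contact us at info@example.com or sales@example.com"
    emails = re.findall(r"[\w.+-]+@[\w-]+\.[\w.]+", text)

    # BeautifulSoup: pull structured pieces out of HTML.
    soup = BeautifulSoup("<a href='/docs'>Docs</a>", "html.parser")
    links = [a["href"] for a in soup.find_all("a")]

    # NetworkX: graph-based data manipulation.
    G = nx.Graph()
    G.add_edges_from([("A", "B"), ("B", "C")])

    print(html_files, emails, links, nx.shortest_path(G, "A", "C"))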