A crawler is a program or script that automatically collects data from the Internet.
You can crawl just about anything: even crawling the comments around "chicken dinner" (PUBG) matches when you are bored can yield many interesting conclusions. Zhihu has a very interesting question, "What cool, interesting, and useful things can be done with crawler technology?", which interested readers can look up themselves.
Right now it is autumn recruitment season at major companies; if you cannot find a suitable resume template, just crawl a batch with Python.
The code is short; interested readers can try it out.
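A minimal sketch of such a template crawler, using requests and BeautifulSoup, might look like this; the listing URL and the download-link selector are hypothetical placeholders that would need to be adapted to a real template site.

import os

import requests
from bs4 import BeautifulSoup

# Hypothetical listing page of resume templates; replace with a real site.
BASE_URL = "https://example.com/resume-templates"
SAVE_DIR = "resume_templates"

def crawl_templates():
    os.makedirs(SAVE_DIR, exist_ok=True)
    html = requests.get(BASE_URL, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Assumed markup: each template is exposed as a direct download link.
    for link in soup.select("a.download"):
        url = link["href"]
        path = os.path.join(SAVE_DIR, url.rsplit("/", 1)[-1])
        with open(path, "wb") as f:
            f.write(requests.get(url, timeout=10).content)
        print("saved", path)

if __name__ == "__main__":
    crawl_templates()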
Data analysis
Scraping a large amount of data is only the first step; to make that data useful, you also need to learn data analysis.
Only by cleaning, de-duplicating, storing, analyzing, and visualizing the data can a large dataset be presented in an easy-to-read form and the information you need be extracted efficiently.
For this, it is worth learning Python libraries such as NumPy, Pandas, and Matplotlib.
They are efficient and convenient: for example, analyzing nearly 20,000 crawled samples can quickly yield a clear visualization.
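As an illustration of that pipeline, the sketch below loads a hypothetical CSV of crawled job postings (the file name and column names are assumptions), cleans and de-duplicates it with Pandas, and plots a summary with Matplotlib.

import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical export of crawled job postings; adjust file and column names.
df = pd.read_csv("job_postings.csv")

# Cleaning: drop rows missing a salary, then remove exact duplicates.
df = df.dropna(subset=["salary"]).drop_duplicates()

# Analysis: average salary per city, highest first.
avg_salary = df.groupby("city")["salary"].mean().sort_values(ascending=False)

# Visualization: bar chart of the top ten cities.
avg_salary.head(10).plot(kind="bar")
plt.ylabel("Average salary")
plt.tight_layout()
plt.show()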
Artificial intelligence
Python plays an irreplaceable role in both traditional machine learning and deep learning: machine learning libraries such as Scikit-learn give it a clear advantage.
Mainstream deep learning frameworks such as Keras, TensorFlow, and PyTorch have also cemented Python's status as the language of choice in the field of deep learning.
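To show how compact traditional machine learning can be in Python, here is a small self-contained Scikit-learn example; it uses the library's built-in iris dataset (not data from this article) to train and evaluate a classifier.

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a built-in toy dataset and hold out a quarter of it for testing.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Fit a random forest and report accuracy on the held-out split.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))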