Job Details
- Work on a distributed back-end Python crawler
Job Requirements
- You love Python
- You are interested in data farming and crawlers
- You want to master scraping methodologies and frameworks such as Scrapy-Cluster, and to orchestrate them with tools like Celery, RabbitMQ, Redis, or Kafka
- You are an open-minded team player.
- You can code on the back end, are not afraid to make mistakes, and can laugh at yourself.
- You love challenges.
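To give a flavor of the core task behind the frameworks listed above, here is a minimal, framework-free sketch of link extraction, the building block of any crawler. It uses only the Python standard library; the page content and URLs are invented for illustration. In a real distributed setup (e.g. Scrapy-Cluster with Redis or Kafka), each extracted link would be pushed onto a shared queue rather than a local list.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects absolute link URLs from an HTML page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Resolve every <a href="..."> against the page's base URL
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical page snippet for demonstration
page = '<a href="/jobs">Jobs</a> <a href="https://example.com/about">About</a>'
extractor = LinkExtractor("https://example.com")
extractor.feed(page)
print(extractor.links)
# -> ['https://example.com/jobs', 'https://example.com/about']
```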
We offer
- Friendly, flexible, and fun working environment.
- Annual salary review, or sooner based on performance.
- We pay for courses you want to take.
- Product equity-sharing opportunity: earn up to 4% as a partner in the tool we developed.
- Free lunch every day.