Summary: We are looking for an experienced team leader/senior developer for our development team, which is building web-crawling software to extract and analyze data from the web.

Overall responsibility:
1. Lead and develop a full stack of web crawler developers
2. Design and develop optimized, reusable components for the web crawler
3. Hand-hold the development team and guide them

Skills / Experience: Over 4 years of experience in application development of Python-based web crawlers. Experience in development with the Scrapy framework is an added advantage (a brief sketch of such a spider follows this posting).

Following are the detailed skill requirements:
1. Ability to understand business requirements and translate them into technical requirements
2. Strong proficiency in Python
3. Expert-level object-oriented analysis and design skills
4. Strong knowledge of RDBMS concepts
5. Senior-level technical experience with JavaScript/CSS/HTML5 and related libraries
6. Familiarity with RESTful APIs
7. Familiarity with code versioning tools (such as Git, SVN, and Mercurial)
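For reference only, and not taken from the posting above, this is a minimal sketch of the kind of Scrapy spider such a role involves. The quotes.toscrape.com URL and the CSS selectors are illustrative assumptions (the site is a public demo commonly used in Scrapy tutorials), not part of the employer's stack.

import scrapy

class QuotesSpider(scrapy.Spider):
    """Minimal illustrative spider: crawl a demo site and yield structured items."""
    name = "quotes"
    start_urls = ["https://quotes.toscrape.com/"]  # assumed demo site, not from the posting

    def parse(self, response):
        # Emit one item per quote block on the page.
        for quote in response.css("div.quote"):
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.css("small.author::text").get(),
                "tags": quote.css("div.tags a.tag::text").getall(),
            }
        # Follow pagination, if present, to crawl subsequent pages.
        next_page = response.css("li.next a::attr(href)").get()
        if next_page is not None:
            yield response.follow(next_page, callback=self.parse)

Run with "scrapy runspider spider.py -o quotes.json" to see the reusable-component pattern the posting alludes to: one spider class, declarative start URLs, and callbacks that yield items.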
-> Strong understanding of object-oriented programming concepts.
-> Expertise in any programming language, preferably Python.
-> At least 3 years of hands-on experience in any web framework.
-> Capable of designing and developing end-to-end cloud-based web applications.
-> Able to lead a team of developers, QA, and DevOps and deliver the project as per the roadmap.
-> Capable of converting product specifications into technical designs to be delivered by the team.
-> Expertise in any business domain in which one has built cloud-based web applications.
At Shopalyst, we are re-imagining how digital consumers discover and purchase products they love. We are building the technology and data infrastructure that enable friction-free purchases from any digital moment that inspires shopping.

Job brief: The Shopalyst SaaS Platform for #FullFunnelMarketing provides easy-to-use, self-serve capabilities for the modern marketer. We are currently looking for people to join our engineering team, where internet scale, reliability, security, high performance, and self-management drive almost every design decision we take.

Must-have requirements: Core server-side technology skills, with expertise in one or more of:
1. Python, with exposure to web scraping tools/frameworks like Scrapy, Beautiful Soup, Selenium, Puppeteer, etc.
2. Excellent communication skills
3. Good working knowledge of any NoSQL (preferred) or relational database
4. Good exposure to working on Linux/Unix systems

Nice-to-have requirements:
1. Understanding of any search server (such as Solr, Elasticsearch, etc.)
2. NoSQL database technologies (e.g. Cassandra)
3. Experience in designing internet-scale systems
4. Leading and mentoring an engineering team
5. Proficient understanding of code versioning tools (such as Git, Mercurial, or SVN)
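This posting names Beautiful Soup alongside Scrapy, Selenium, and Puppeteer. Purely as an illustrative sketch, and not taken from Shopalyst's stack, here is the kind of single-page extraction Beautiful Soup is typically used for; the example.com URL and the h2.product-title selector are assumptions.

import requests
from bs4 import BeautifulSoup

# Hypothetical target page; replace with a real URL you are permitted to scrape.
URL = "https://example.com/products"

def scrape_titles(url: str) -> list[str]:
    """Fetch one page and pull out product titles with a CSS selector (assumed markup)."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    return [tag.get_text(strip=True) for tag in soup.select("h2.product-title")]

if __name__ == "__main__":
    for title in scrape_titles(URL):
        print(title)

Beautiful Soup suits one-off page parsing like this; for the internet-scale crawling the posting emphasizes, a framework such as Scrapy adds scheduling, retries, and politeness controls on top of the same idea.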
Relevant set of skills
● Good communication and collaboration skills, with 4-7 years of experience.
● Ability to code and script, with a strong grasp of CS fundamentals and excellent problem-solving abilities.
● Comfort with frequent, incremental code testing and deployment; data management skills.
● Good understanding of RDBMS.
● Experience in building data pipelines and processing large datasets.
● Knowledge of web scraping and data mining is a plus.
● Working knowledge of open-source tools such as MySQL, Solr, Elasticsearch, and Cassandra (data stores) would be a plus.
● Expert in Python programming.

Role and responsibilities
● Inclined towards working in a start-up environment.
● Design and build robust, scalable data engineering solutions for structured and unstructured data, delivering business insights, reporting, and analytics (a miniature pipeline sketch follows this list).
● Expertise in troubleshooting, debugging, data completeness and quality issues, and scaling overall system performance.
● Build robust APIs that power our delivery points (dashboards, visualizations, and other integrations).
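As a minimal sketch of the extract-and-load step behind such data pipelines, and not a description of this employer's system, the following assumes a hypothetical JSON endpoint and a local SQLite table; the URL, table name, and field names are all illustrative.

import json
import sqlite3
import urllib.request

# Hypothetical source endpoint; any JSON API returning a list of records would do.
SOURCE_URL = "https://example.com/api/products"

def extract(url: str) -> list[dict]:
    """Fetch raw records from the (assumed) JSON endpoint."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def load(records: list[dict], db_path: str = "pipeline.db") -> None:
    """Write the records into a local SQLite table for downstream reporting."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS products (id INTEGER PRIMARY KEY, name TEXT, price REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO products (id, name, price) VALUES (:id, :name, :price)",
        records,
    )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    load(extract(SOURCE_URL))

In production such steps would typically sit behind an orchestrator and feed the dashboards and APIs the posting mentions, but the extract/load split shown here is the core pattern.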