Python Jobs in Delhi, NCR and Gurgaon
Talented C++ Developer with experience in the design, development, and debugging of multi-threaded, large-scale applications; a good understanding of data structures; familiarity with Linux packaging, functional testing, and deployment automation; and very good problem-solving skills.
Key responsibilities :
- Understand fundamental design principles and best practices for developing backend servers and web applications
- Gather requirements, scope functionality, estimate and translate those requirements into solutions
- Implement and integrate software features as per requirements
- Deliver across the entire app life cycle
- Work in a product creation project and/or technology project with implementation or integration responsibilities
- Improve an existing code base, if required, and ability to read source code to understand data flow and origin
- Design effective data storage for the task at hand and know how to optimize query performance along the way
- Follow an agile methodology of development and delivery
- Strictly adhere to coding standards and internal practices; must be able to conduct code reviews
- Mentor and possibly lead junior developers
- Contribute towards innovation
- Performance optimization of apps
- Explain technologies and solutions to technical and non-technical stakeholders
- Diagnose bugs and other issues in products
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency
Must have / Good to have:
- 5-7 years' experience with C++ development; 3+ years with modern C++ (C++11/14/17) would be a plus.
- Design and implementation of high-availability, high-performance applications in a Linux environment
- Advanced knowledge of C/C++, Object Oriented Design, STL
- Good with multithreading and data structures
- Develop back-end components to improve responsiveness and overall performance
- Familiarity with database design, integration with applications, and Python packaging.
- Familiarity with front-end technologies (like JavaScript and HTML5), REST API, security considerations
- Familiarity with functional testing and deployment automation frameworks
- Experience developing 3-4 production-ready applications in C++
- Experience in writing unit test cases including positive and negative test cases
- Experience of CI/CD pipeline code deployment (Git, SVN, Jenkins or Teamcity)
- Experience with Agile and DevOps methodology
- Very good problem-solving skills
- Experience with Web technologies is a plus.
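The unit-testing expectation above (positive and negative cases) can be sketched briefly. `parse_port` is a hypothetical helper, and Python is used here for brevity; the same positive/negative pattern applies to C++ test frameworks such as GoogleTest.

```python
def parse_port(value: str) -> int:
    """Parse a TCP port number, rejecting anything outside 1-65535."""
    port = int(value)
    if not 1 <= port <= 65535:
        raise ValueError(f"port out of range: {port}")
    return port

# Positive case: valid input produces the expected result.
assert parse_port("8080") == 8080

# Negative cases: invalid input must raise, not silently succeed.
for bad in ["0", "70000", "abc"]:
    try:
        parse_port(bad)
    except ValueError:
        pass
    else:
        raise AssertionError(f"expected ValueError for {bad!r}")
```

The point is that a test suite covers both what the code must accept and what it must reject.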
Responsibilities:
● Take ownership of complex software architectures and operational metrics, and run the day-to-day operations of the business.
● Define the interaction between different applications at systems level architecture
● Ramp up the team and provide architectural blueprints and technical leadership to our product engineering team
● Work closely with peers and product managers to develop the best technical design and approach for new product development.
● Contribute to the senior management team, and guide strategic decisions and resource allocation.
● Attract and retain top engineering talent. Ensure engineering excellence and rigor in architecture, execution, and delivery. Mentor and develop leaders within your team.
Requirements
● Proficient in at least one language used for web development; we primarily write in Python
● 4+ years of experience designing and building web applications (backend)
● Experience in building backend web applications and developing REST APIs
● Experience working with SQL and NoSQL databases
● Exposure to CI/CD pipelines
● [Bonus] Hands-on experience with big data frameworks like Spark is a plus
● [Bonus] Working knowledge of AWS or any other cloud platform
● [Bonus] Experience in ETL, Data Lake, and data warehouse pipelines
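As a rough illustration of the backend/REST API experience described above, here is a minimal sketch using Python's standard-library WSGI interface. The `/health` route and JSON payload are illustrative assumptions, not a prescribed design.

```python
import json
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    """Tiny WSGI app exposing GET /health as a JSON REST endpoint."""
    if environ.get("PATH_INFO") == "/health":
        start_response("200 OK", [("Content-Type", "application/json")])
        return [json.dumps({"status": "ok"}).encode()]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"error": "not found"}).encode()]

# Exercise the app in-process, without starting a server.
environ = {}
setup_testing_defaults(environ)
environ["PATH_INFO"] = "/health"
statuses = []
body = app(environ, lambda status, headers: statuses.append(status))
```

In practice a framework (Flask, Django, FastAPI) would handle routing and serialization; the WSGI-level sketch just shows the request/response contract a backend developer is expected to understand.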
Job Title: Data Engineer
Job Summary: As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure and tools necessary for data collection, storage, processing, and analysis. You will work closely with data scientists and analysts to ensure that data is available, accessible, and in a format that can be easily consumed for business insights.
Responsibilities:
- Design, build, and maintain data pipelines to collect, store, and process data from various sources.
- Create and manage data warehousing and data lake solutions.
- Develop and maintain data processing and data integration tools.
- Collaborate with data scientists and analysts to design and implement data models and algorithms for data analysis.
- Optimize and scale existing data infrastructure to ensure it meets the needs of the business.
- Ensure data quality and integrity across all data sources.
- Develop and implement best practices for data governance, security, and privacy.
- Monitor data pipeline performance and errors, and troubleshoot issues as needed.
- Stay up-to-date with emerging data technologies and best practices.
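The pipeline responsibilities above can be sketched minimally in Python. The table name, schema, and use of SQLite as a stand-in for the warehouse are assumptions for illustration only.

```python
import csv
import io
import sqlite3

def extract(raw_csv: str):
    """Extract: parse raw CSV rows from a source system."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: normalise types and drop incomplete records."""
    out = []
    for row in rows:
        if row["amount"]:  # skip rows missing the amount field
            out.append((row["id"], float(row["amount"])))
    return out

def load(records, conn):
    """Load: write cleaned records into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", records)

raw = "id,amount\nA1,10.5\nA2,\nA3,7.25\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

Real pipelines swap each stage for the appropriate connector or orchestrator, but the extract/transform/load separation stays the same.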
Requirements:
Bachelor's degree in Computer Science, Information Systems, or a related field.
Experience with ETL tools like Matillion, SSIS, or Informatica
Experience with SQL and relational databases such as SQL server, MySQL, PostgreSQL, or Oracle.
Experience in writing complex SQL queries
Strong programming skills in languages such as Python, Java, or Scala.
Experience with data modeling, data warehousing, and data integration.
Strong problem-solving skills and ability to work independently.
Excellent communication and collaboration skills.
Familiarity with big data technologies such as Hadoop, Spark, or Kafka.
Familiarity with data warehouse/data lake technologies like Snowflake or Databricks
Familiarity with cloud computing platforms such as AWS, Azure, or GCP.
Familiarity with Reporting tools
Teamwork / growth contribution
- Helping the team conduct interviews and identify the right candidates
- Adhering to timelines
- Timely status communication and upfront communication of any risks
- Teaching, training, and sharing knowledge with peers
- Good communication skills
- Proven abilities to take initiative and be innovative
- Analytical mind with a problem-solving aptitude
Good to have :
Master's degree in Computer Science, Information Systems, or a related field.
Experience with NoSQL databases such as MongoDB or Cassandra.
Familiarity with data visualization and business intelligence tools such as Tableau or Power BI.
Knowledge of machine learning and statistical modeling techniques.
If you are passionate about data and want to work with a dynamic team of data scientists and analysts, we encourage you to apply for this position.
Objective:
The objective of this position is to undertake enhancements on the CARE MIS, including, but not limited to, its backend and interface design, in order for the MIS to effectively and efficiently serve as a project management tool for CARE.
Qualifications:
Knowledge
At least Bachelor's Degree in computer science, information/communication technologies, or other related fields
Strong Knowledge of Content Management Systems front end and backend design
Strong technical knowledge of databases, schema, and web development
Strong knowledge of Python, JavaScript, CSS, PHP, HTML5, Java, Bootstrap 5, jQuery, etc.
Experience
At least 2 to 3 years of experience in software/database development and management, and analysis tools using relevant technologies
Demonstrable knowledge and experience of CakePHP
Demonstrable experience in web-based application development in any standard platform, preferably PHP; experience in database design (MySQL, Microsoft SQL, or Access)
Demonstrable experience in HTML5, AngularJS, Python, PHP, Perl, shell scripting, and Linux/Unix distribution platforms
Demonstrable experience in enterprise IT design and knowledge of cybersecurity
Skills and abilities
Excellent analytical skills and problem-solving ability
English proficiency in communication, presentation, and report writing
Skillful in statistical analysis
Personal qualities
Excellent analytical and problem-solving skills
Demonstrable ability to plan and organize work independently, with the ability to prioritize work, meet deadlines, and manage changing priorities
Excellent interpersonal and communication skills
High commitment to responsibility and work quality
Ability to work effectively and efficiently, independently and/or within a multi-cultural team
Openness/receptiveness to critique for enhancing work and outputs
Location : D Block, Okhla Phase-1, New Delhi
Senior Data Scientist
Your goal: To improve the education process and improve the student experience through data.
The organization: Data Science for Learning Services. Data Science and Machine Learning are core to Chegg. As a Student Hub, we want to ensure that students discover the full breadth of learning solutions we have to offer to get full value on their learning time with us. To create the most relevant and engaging interactions, we are solving a multitude of machine learning problems so that we can better model student behavior, link various types of content, optimize workflows, and provide a personalized experience.
The Role: Senior Data Scientist
As a Senior Data Scientist, you will focus on conducting research and development in NLP and ML. You will be responsible for writing production-quality code for data product solutions at Chegg. You will lead the identification and implementation of key projects for data processing and knowledge discovery.
Responsibilities:
• Translate product requirements into AIML/NLP solutions
• Be able to think out of the box and be able to design novel solutions for the problem at hand
• Write production-quality code
• Be able to design data and annotation collection strategies
• Identify key evaluation metrics and release requirements for data products
• Integrate new data and design workflows
• Innovate, share, and educate team members and community
Requirements:
• Working experience in machine learning, NLP, recommendation systems, experimentation, or related fields, with a specialization in NLP
• Working experience on large language models that cater to multiple tasks such as text generation, Q&A, summarization, and translation is highly preferred
• Knowledge of MLOps and deployment pipelines is a must
• Expertise in supervised, unsupervised, and reinforcement ML algorithms.
• Strong programming skills in Python
• Top data wrangling skills using SQL or NoSQL queries
• Experience using containers to deploy real-time prediction services
• Passion for using technology to help students
• Excellent communication skills
• Good team player and a self-starter
• Outstanding analytical and problem-solving skills
• Experience working with ML pipeline products such as AWS Sagemaker, Google ML, or Databricks a plus.
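As a small illustration of the "identify key evaluation metrics" responsibility above, precision and recall for a binary classifier can be computed directly. This is a generic sketch, not Chegg's actual evaluation code.

```python
def precision_recall(y_true, y_pred, positive=1):
    """Compute precision and recall for binary labels.

    precision = TP / (TP + FP): of everything predicted positive,
    how much was right. recall = TP / (TP + FN): of everything
    actually positive, how much was found.
    """
    pairs = list(zip(y_true, y_pred))
    tp = sum(1 for t, p in pairs if t == positive and p == positive)
    fp = sum(1 for t, p in pairs if t != positive and p == positive)
    fn = sum(1 for t, p in pairs if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

p, r = precision_recall([1, 0, 1, 1], [1, 1, 0, 1])
```

For data products, release requirements are typically stated as thresholds on metrics like these, measured on a held-out annotated set.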
Why do we exist?
Students are working harder than ever before to stabilize their future. Our recent research study called State of the Student shows that nearly 3 out of 4 students are working to support themselves through college and 1 in 3 students feel pressure to spend more than they can afford. We founded our business on providing affordable textbook rental options to address these issues. Since then, we’ve expanded our offerings to supplement many facets of higher educational learning through Chegg Study, Chegg Math, Chegg Writing, Chegg Internships, Thinkful Online Learning, and more, to support students beyond their college experience. These offerings lower financial concerns for students by modernizing their learning experience. We exist so students everywhere have a smarter, faster, more affordable way to student.
Video Shorts
Life at Chegg: https://jobs.chegg.com/Video-Shorts-Chegg-Services
Certified Great Place to Work!: http://reviews.greatplacetowork.com/chegg
Chegg India: http://www.cheggindia.com/
Chegg Israel: http://insider.geektime.co.il/organizations/chegg
Thinkful (a Chegg Online Learning Service): https://www.thinkful.com/about/#careers
Chegg out our culture and benefits!
http://www.chegg.com/jobs/benefits
https://www.youtube.com/watch?v=YYHnkwiD7Oo
Chegg is an equal-opportunity employer
About Us - Celebal Technologies is a premier software services company in the fields of Data Science, Big Data, and Enterprise Cloud. Celebal Technologies helps you discover competitive advantage by employing intelligent data solutions using cutting-edge technology that can bring massive value to your organization. The core offerings are around "Data to Intelligence", wherein we leverage data to extract intelligence and patterns, thereby facilitating smarter and quicker decision-making for clients. With Celebal Technologies, which understands the core value of modern analytics for the enterprise, we help businesses improve their business intelligence and become more data-driven in architecting solutions.
Key Responsibilities
• As a part of the DevOps team, you will be responsible for configuration, optimization, documentation, and support of the CI/CD components.
• Creating and managing build and release pipelines with Azure DevOps and Jenkins.
• Assist in planning and reviewing application architecture and design to promote an efficient deployment process.
• Troubleshoot server performance issues & handle the continuous integration system.
• Automate infrastructure provisioning using ARM Templates and Terraform.
• Monitor and Support deployment, Cloud-based and On-premises Infrastructure.
• Diagnose and develop root cause solutions for failures and performance issues in the production environment.
• Deploy and manage Infrastructure for production applications
• Configure security best practices for application and infrastructure
Essential Requirements
• Good hands-on experience with cloud platforms like Azure, AWS & GCP. (Preferably Azure)
• Strong knowledge of CI/CD principles.
• Strong work experience with CI/CD implementation tools like Azure DevOps, TeamCity, Octopus Deploy, AWS CodeDeploy, and Jenkins.
• Experience writing automation scripts with PowerShell, Bash, Python, etc.
• GitHub, JIRA, Confluence, and Continuous Integration (CI) system.
• Understanding of secure DevOps practices
Good to Have -
• Knowledge of scripting languages such as PowerShell, Bash
• Experience with project management and workflow tools such as Agile, Jira, Scrum/Kanban, etc.
• Experience with Build technologies and cloud services. (Jenkins, TeamCity, Azure DevOps, Bamboo, AWS Code Deploy)
• Strong communication skills and ability to explain protocol and processes with team and management.
• Must be able to handle multiple tasks and adapt to a constantly changing environment.
• Must have a good understanding of SDLC.
• Knowledge of Linux, Windows server, Monitoring tools, and Shell scripting.
• Self-motivated; demonstrating the ability to achieve in technologies with minimal supervision.
• Organized, flexible, and analytical ability to solve problems creatively.
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting languages - Python & PySpark
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Data ingestion from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems; implement data ingestion and processing with the help of Big Data technologies
- Data processing/transformation using various technologies such as Spark and Cloud Services. You will need to understand your part of business logic and implement it using the language supported by the base data platform
- Develop automated data quality checks to make sure the right data enters the platform, and verify the results of calculations
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports and presenting findings
- Mentor junior members and bring best industry practices
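The automated data-quality checks listed above can be sketched as follows; the field names and rules (`id`, `amount`, non-negativity) are illustrative assumptions, not this employer's actual schema.

```python
def check_quality(records, required=("id", "amount"), min_amount=0.0):
    """Flag records that violate basic completeness and range rules.

    Returns a list of (record index, reason) pairs so the pipeline can
    quarantine bad rows instead of silently loading them.
    """
    errors = []
    for i, rec in enumerate(records):
        for field in required:
            if rec.get(field) in (None, ""):
                errors.append((i, f"missing {field}"))
        amt = rec.get("amount")
        if isinstance(amt, (int, float)) and amt < min_amount:
            errors.append((i, "negative amount"))
    return errors

rows = [
    {"id": "A1", "amount": 12.0},
    {"id": "", "amount": 5.0},     # fails completeness
    {"id": "A3", "amount": -3.0},  # fails range check
]
issues = check_quality(rows)
```

Production systems typically express such rules declaratively (e.g. in a data-quality framework) and alert on failure rates, but the check itself reduces to this shape.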
QUALIFICATIONS
- 5-7+ years’ experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of at least one of: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with
- Data mining/programming tools (e.g. SAS, SQL, R, Python)
- Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
- Data visualization (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience in the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
About the company
A strong cross-functional team of designers, software developers, and hardware experts who love creating technology products and services. We are not just an outsourcing partner; with our deep expertise across several business verticals, we bring our best practices so that your product journey is a breeze.
We love healthcare, medical devices, finance, and consumer electronics but we love almost everything where we can build technology products and services. In the past, we have created several niche and novel concepts and products for our customers, and we believe we still learn every day to widen our horizons!
Introduction - Advanced Technology Group
As an extension to solving the continuous medical education needs of doctors through the courses platform, Iksha Labs also developed several cutting-edge solutions for simulated training and education, including
- Virtual Reality and Augmented Reality based surgical simulations
- Hand and face-tracking-based simulations
- Remote immersive and collaborative training through Virtual Reality
- Machine learning-based auto-detection of clinical conditions from medical images
Job Description
The ideal candidate will be responsible for developing high-quality applications. They will also be responsible for designing and implementing testable and scalable code.
Key Skills/Technology
- Good command of C and C++, with algorithms and data structures
- Image Processing
- Qt (Expertise)
- Python (Expertise)
- Embedded Systems
- Good working knowledge of STL/Boost Algorithms and Data structures
Responsibilities
- Develop quality software and web applications
- Analyze and maintain existing software applications
- Develop scalable, testable code
- Discover and fix programming bugs
Qualifications
Bachelor's degree or equivalent experience in Computer Science/Electronics and Communication or a related field.
Industry Type
Medical / Healthcare
Functional Area
IT Software - Application Programming, Maintenance
ROLE: Full Stack Developer
About the company:
CogniTensor is an analytical software company that brings data to the heart of decision-making. CogniTensor leverages its product, DeepOptics - an integrated platform to implement 3A (Automation, Analytics and AI) at scale.
CogniTensor has customers across Finance, Energy, Commodity, Retail & Manufacturing. More details can be found on our website: https://www.cognitensor.com/
Our strategic investors include Shell and CIIE.CO (IIM-A/Accenture).
Qualification & Experience:
- BE/B.Tech Degree in Computer Programming, Computer Science, or a related field.
- 2+ years' experience as a Software Developer.
- Hands-on experience developing finance applications is a must
Roles & Responsibilities:
We are looking for a Full Stack Developer to produce scalable software solutions. You’ll be part of a cross-functional team that’s responsible for the full software development life cycle, from conception to deployment.
As a Full Stack Developer, you should be comfortable with both front-end and back-end coding languages, development frameworks, and third-party libraries. You should also be a team player with a knack for visual design and utility, along with familiarity with Agile methodologies and testing skills.
- Work with development teams and product managers to ideate software solutions
- Design client-side and server-side architecture
- Develop and manage well-functioning databases and applications
- Write effective APIs
- Write technical documentation
- Excellent communication and teamwork skills
Technical Skills:
Must Have
- React JS
- Git / Bitbucket,
- Express JS, Python, HTML, CSS, Node JS
- CI/CD like CircleCI
- Postgres or any DB knowledge
- Familiarity with AWS or Azure or both
Good to Have
- Docker
- Redux
- Android development
- React Native
- Electron
- GraphQL
- Jira
What’s in for you:
● An opportunity to lead a business segment
● Extensive liaising with customers and partners
● A rewarding career progression
Preferred Location:
Delhi NCR
Job Description of Azure Data Factory Engineer
Minimum relevant experience - 2 years
· Ability to develop ADF ETL pipelines for transforming and ingesting data into the data warehouse.
· Ability to debug data load issues, transformation/translation problems etc.
· Ability to understand data transformation and translation requirements and which tools to leverage to get the job done.
· Strong understanding of various data formats such as CSV, XML, JSON, PARQUET etc.
· Working knowledge of data quality approaches and error handling techniques.
· Good understanding of data modelling for data warehouse and data marts
· Strong verbal and written communication skills
· Ability to learn, contribute and grow in a fast paced environment
· Good to have: Knowledge of CI/CD pipelines, Python/ Spark scripting
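ADF pipelines themselves are configured in the service, but the row-level idea behind "data quality approaches and error handling techniques" can be sketched in plain Python: translate each source row, and capture bad rows instead of failing the whole load. The record shape here is an illustrative assumption.

```python
import csv
import io

def parse_row(row):
    """Translate one CSV row into the target record shape, returning
    (record, None) on success or (None, error) on a bad row."""
    try:
        return {"id": row["id"], "qty": int(row["qty"])}, None
    except (KeyError, ValueError) as exc:
        return None, f"{row}: {exc}"

raw = "id,qty\nA1,3\nA2,oops\n"
good, bad = [], []
for row in csv.DictReader(io.StringIO(raw)):
    rec, err = parse_row(row)
    (good if rec else bad).append(rec or err)
```

In ADF the equivalent is routing failed rows to an error sink or log rather than aborting the copy activity; the split between clean and quarantined records is the same.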
Job Responsibilities:
Support, maintain, and enhance existing and new product functionality for trading software in a real-time, multi-threaded, multi-tier server architecture environment, and create high- and low-level designs for a concurrent, high-throughput, low-latency software architecture.
- Provide software development plans that meet future needs of clients and markets
- Evolve the new software platform and architecture by introducing new components and integrating them with existing ones
- Perform memory, CPU, and resource management
- Analyze stack traces, memory profiles and production incident reports from traders and support teams
- Propose fixes, and enhancements to existing trading systems
- Adhere to release and sprint planning with the Quality Assurance Group and Project Management
- Work on a team building new solutions based on requirements and features
- Attend and participate in daily scrum meetings
Required Skills:
- JavaScript and Python
- Multi-threaded browser and server applications
- Amazon Web Services (AWS)
- REST
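A minimal sketch of the multi-threaded producer/consumer pattern common in such server tiers, using Python's standard library. The event names are illustrative, and a real trading system would use far lower-level, lower-latency primitives.

```python
import queue
import threading

# One thread feeds market events into a bounded queue; a worker
# thread drains it, decoupling ingestion from processing.
events = queue.Queue(maxsize=100)
processed = []

def consumer():
    while True:
        item = events.get()
        if item is None:       # sentinel: shut the worker down
            break
        processed.append(item.upper())
        events.task_done()

worker = threading.Thread(target=consumer)
worker.start()
for tick in ["buy", "sell", "hold"]:
    events.put(tick)
events.put(None)
worker.join()
```

The bounded queue provides backpressure: if the consumer falls behind, producers block instead of exhausting memory.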
Essential Duties and Responsibilities:
- Build data systems and pipelines
- Prepare data for ML modeling
- Combine raw information from different sources
- Conduct complex data analysis and report on results
- Build data systems and pipelines.
Work Experience :
- 3 years of experience working with Node, AI/ML & Data Transformation Tools
- Hands on experience with ETL & Data Visualization tools
- Familiarity with Python (NumPy, Pandas)
- Experience with SQL & NoSQL DBs
Must Have: Python, data warehouse tools, ETL, SQL/MongoDB, data modeling, data transformation, data visualization
Nice to have: MongoDB/SQL, Snowflake, Matillion, Node.js, ML model building
Responsibilities:
Responsible for the right tech solutions, maintaining the right balance between short-term and long-term outcomes.
Working with the TL/EM to define best practices for development and champion their adoption, while architecting and designing technically robust, flexible, and scalable solutions.
Operating with scale and speed amidst flux, there is just a LOT happening.
Perform well in uncertainties; collaborate and work with unclear interfaces to other teams in our fast-paced environment.
Eligibility:
B.Tech/M.Tech in Computer Science from IIT, IIIT, NIT, REC, BITS, or other top-tier colleges.
2+ years of total work experience in high scale web-based products, preferably in a startup environment.
Strong hands-on experience in GoLang.
Sound understanding of web technologies (e.g. JavaScript, HTML5, CSS) and strong command over databases (MySQL, PostgreSQL, Redis, Elasticsearch).
Knowledge of Microservices architecture patterns
Excellent personal, People & communication skills
Experience in organization-wide initiatives and change management.
About Credgenics:
Credgenics is India’s first-of-its-kind NPA resolution platform, backed by credible investors including Accel Partners and Titan Capital. We work with financial institutions, banks, NBFCs & digital lending firms to improve the efficiency of their collections using technology, automation, intelligence, and optimal legal routes in order to facilitate the resolution of stressed assets.
With all major banks and NBFCs as our clients, our SaaS based collections platform helps them efficiently improve their NPA, geographic reach and customer experience.
We count most of India's lending majors as our clients, such as ICICI Bank, Axis Bank, Bank of Baroda, etc., and have been able to grow 100% MoM consistently, even amid the pandemic.
Requirements:
- Have 7-10 years of rich experience in software design and development in telecom.
- Must have experience building telephony products from scratch to end.
- Must have knowledge of telephony technologies such as VoIP, SIP, RTP, TLS, DTLS, SDP, MSRP, Asterisk, FreeSWITCH, networking, etc.
- Proficient in any of the programming languages (C, C++, Python, Java, socket programming)
- Proficient in fundamental software design principles, data structures, algorithms, problem solving, and complexity analysis.
- Strong understanding of professional software engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations.
- Experience with SQL/NoSQL databases such as Postgres, MySQL, or MongoDB is a huge advantage
- Hands-on experience with GIT, CI/CD, REST APIs and Cloud (AWS/ GCP/ Azure) will be preferred
- Understanding of Microservice Architecture and System Design Principles.
- Self-driven, detail-oriented and Strong sense of ownership, urgency, and ability to deliver great work.
- Demonstrated expertise in developing, releasing and maintaining large scale software applications.
Responsibilities:
- Build and lead an outstanding engineering team.
- Make architectural decisions in collaboration with other developers.
- Work in a cross-functional team (Product Managers and Engineers) in a highly collaborative environment where you will also speak to our community and often pair to solve problems.
- Work in a full-stack development environment.
- Coach and mentor team members to encourage continuous growth.
- Implement new features and deploy them on an ongoing basis.
- Design, implement and improve release processes.
- Provide technical support to external teams and developers.
- Work across the whole development lifecycle (discovery, delivery, testing, releasing, supporting, and maintaining) using an agile methodology
- Plan and execute sprints
- Closely collaborate with internal and external experts within various technical domains
- Help the team champion software quality while being pragmatic
- Demonstrate and inspire a strong sense of teamwork
- Demonstrate and inspire integrity, quality, and professionalism
We offer an innovative, fast-paced, and entrepreneurial work environment where you’ll be at the centre of leading change by leveraging technology and creating boundless impact in the FinTech ecosystem.
**THIS IS A 100% WORK FROM OFFICE ROLE**
We are looking for an experienced DevOps engineer that will help our team establish DevOps practice. You will work closely with the technical lead to identify and establish DevOps practices in the company.
You will help us build scalable, efficient cloud infrastructure. You’ll implement monitoring for automated system health checks. Lastly, you’ll build our CI pipeline, and train and guide the team in DevOps practices.
ROLE and RESPONSIBILITIES:
• Understanding customer requirements and project KPIs
• Implementing various development, testing, automation tools, and IT infrastructure
• Planning the team structure, activities, and involvement in project management activities.
• Managing stakeholders and external interfaces
• Setting up tools and required infrastructure
• Defining and setting development, test, release, update, and support processes for DevOps operation
• Have the technical skill to review, verify, and validate the software code developed in the project.
• Troubleshooting techniques and fixing the code bugs
• Monitoring the processes during the entire lifecycle for adherence, and updating or creating new processes for improvement and to minimize wastage
• Encouraging and building automated processes wherever possible
• Identifying and deploying cybersecurity measures by continuously performing vulnerability assessment and risk management
• Incident management and root cause analysis
• Coordination and communication within the team and with customers
• Selecting and deploying appropriate CI/CD tools
• Striving for continuous improvement and building a continuous integration / continuous delivery (CI/CD) pipeline
• Mentoring and guiding the team members
• Monitoring and measuring customer experience and KPIs
• Managing periodic reporting on the progress to the management and the customer
Essential Skills and Experience Technical Skills
• Proven 3+ years of experience as a DevOps engineer
• A bachelor’s degree or higher qualification in computer science
• The ability to code and script in multiple languages and automation frameworks, such as Python, C#, Java, Perl, and Ruby, plus databases such as SQL Server, MySQL, and NoSQL stores
• An understanding of the best security practices and of automating security testing and updating in CI/CD (continuous integration, continuous deployment) pipelines
• An ability to deploy monitoring and logging infrastructure using appropriate tools.
• Proficiency in container frameworks
• Mastery in the use of infrastructure automation toolsets like Terraform, Ansible, and command line interfaces for Microsoft Azure, Amazon AWS, and other cloud platforms
• Certification in Cloud Security
• An understanding of various operating systems
• A strong focus on automation and agile development
• Excellent communication and interpersonal skills
• An ability to work in a fast-paced environment and handle multiple projects
simultaneously
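The scripting and monitoring duties above can be illustrated with a small, hypothetical Python sketch; the log format ("METHOD PATH STATUS") and the alert threshold are invented for the example, not taken from the posting:

```python
# Minimal log-monitoring sketch: compute an HTTP error rate from
# access-log-style lines and flag when it crosses a threshold.

def error_rate(log_lines):
    """Return the fraction of requests with a 5xx status code."""
    statuses = [int(line.split()[-1]) for line in log_lines if line.strip()]
    if not statuses:
        return 0.0
    return sum(1 for s in statuses if 500 <= s < 600) / len(statuses)

def should_alert(log_lines, threshold=0.1):
    """True when the error rate exceeds the (assumed) alert threshold."""
    return error_rate(log_lines) > threshold

logs = [
    "GET /api/users 200",
    "POST /api/orders 201",
    "GET /api/users 500",
    "GET /health 200",
]
print(error_rate(logs))  # 1 of 4 requests failed -> 0.25
```

In a real DevOps setup this logic would typically live behind a metrics/alerting stack rather than a one-off script, but the shape of the check is the same.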
OTHER INFORMATION
The DevOps Engineer will also be expected to demonstrate their commitment:
• to gedu values and regulations, including the equal opportunities policy.
• to gedu’s Social, Economic and Environmental responsibilities, minimising environmental impact in the performance of the role and actively contributing to the delivery of gedu’s Environmental Policy.
• to their Health and Safety responsibilities to ensure their contribution to a safe and secure working environment for staff, students, and other visitors to the campus.
About the Company :
Our Client enables enterprises in their digital transformation journey by offering Consulting & Implementation Services related to Data Analytics & Enterprise Performance Management (EPM).
Our Client delivers the best-suited solutions to customers across industries such as Retail & E-commerce, Consumer Goods, Pharmaceuticals & Life Sciences, Real Estate & Senior Housing, Hi-tech, Media & Telecom, as well as Manufacturing and Automotive clientele.
Our in-house research and innovation lab has conceived multiple plug-n-play apps, toolkits and plugins to streamline implementation and enable faster time-to-market.
Job Title– AWS Developer
Notice period- Immediate to 60 days
Experience – 3–8 years
Location - Noida, Mumbai, Bangalore & Kolkata
Roles & Responsibilities
- Bachelor’s degree in Computer Science or a related analytical field or equivalent experience is preferred
- 3+ years’ experience in one or more architecture domains (e.g., business architecture, solutions architecture, application architecture)
- Must have 2 years of experience in design and implementation of cloud workloads in AWS.
- Minimum of 2 years of experience handling workloads in large-scale environments. Experience in managing large operational cloud environments spanning multiple tenants through techniques such as Multi-Account management, AWS Well Architected Best Practices.
- Minimum 3 years of microservice architectural experience.
- Minimum of 3 years of experience working exclusively designing and implementing cloud-native workloads.
- Experience with analysing and defining technical requirements & design specifications.
- Experience with database design with both relational and document-based database systems.
- Experience with integrating complex multi-tier applications.
- Experience with API design and development.
- Experience with cloud networking and network security, including virtual networks, network security groups, cloud-native firewalls, etc.
- Proven ability to write programs using an object-oriented or functional programming language such as Python, and experience with Spark, AWS Glue, and AWS Lambda
Job Specification
*Strong and innovative approach to problem solving and finding solutions.
*Excellent communicator (written and verbal, formal and informal).
*Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
*Ability to multi-task under pressure and work independently with minimal supervision.
Regards
Team Merito
Role : Full-Time
Reporting to : BI Architect/Program Manager / COE Head
Shift : Normal Day shift with some overlap with onshore team in US
Location: Remote
Experience : 3 - 8 years
Education : BE/BTECH/MCA/MSc. in Computer Science, Business Intelligence, Business
Analytics, BigData
Industry : Enterprise Software Development or Software Product Engineering Services
Primary Skills : Python, SQL, GCP, CDF, CDAP
Secondary Skills: GCP (BigQuery/ Dataproc) / other Cloud DW/ETL - IICS, AWS Glue, Azure Synapse, Snowflake, Talend
Required Experience
● Must be able to code in Python and SQL on Big Data platforms
● Must have extensive experience implementing DWH/ETL solutions involving multiple data
sources (SAP specifically) and complex transformations for large enterprise customers, preferably Fortune 1000
● Must have 3+ years of experience developing and deploying data pipelines on cloud-native ETL tools, e.g. CDAP, IICS, AWS Glue, Azure Synapse, Snowflake, GCP Composer, etc.
● Must have prior experience of migrating on-premises workloads to GCP CDF or other cloud native services
● Must be proficient in troubleshooting and performance tuning of GCP Services
● Must have executed projects using Agile Scrum methodology and aware of all processes
involved in Scrum
● Good to have experience with cloud deployment of pipelines and orchestration tools
(Airflow, Composer).
● Good to have: hands-on experience/working knowledge of JDBT, REST API, Hive, Java
● Should have experience designing data models that serve multiple applications on the
same underlying model (common schemas across multiple scenarios).
● Should have extensive knowledge of large-scale data processing concepts and
technologies.
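The DWH/ETL experience described above boils down to extract-transform-load steps. A minimal pure-Python sketch of that shape follows; the records and field names (e.g. "MATNR" as a SAP-style material code) are invented for illustration:

```python
# Minimal ETL-style transform sketch: extract records from a source,
# normalize them, and load them into a target store.

def extract():
    # Stand-in for reading from a source system (e.g. a SAP extract).
    return [
        {"MATNR": "  abc-001 ", "QTY": "5"},
        {"MATNR": "xyz-002", "QTY": "12"},
    ]

def transform(records):
    # Normalize material codes and cast quantities to integers.
    return [
        {"material": r["MATNR"].strip().upper(), "qty": int(r["QTY"])}
        for r in records
    ]

def load(rows, target):
    # Stand-in for writing to a warehouse table.
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse[0])  # {'material': 'ABC-001', 'qty': 5}
```

A real pipeline on CDAP, Glue, or Composer would express the same three stages as managed jobs and handle scale, retries, and scheduling.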
Non-Technical/ Behavioral competencies required
● Should have very good verbal and written communication, technical articulation,
listening and presentation skills
● Should have proven analytical and problem-solving skills
● Should have demonstrated effective task prioritization, time management and
internal/external stakeholder management skills
● Should be a quick learner, self-starter, go-getter and team player
● Should have experience of working under stringent deadlines in a Matrix organization
structure.
Job Responsibilities
● Consults with Product/Program Manager to identify minimal viable product and
decomposes feature set into small, scoped user stories
● Set Functional product direction for the application development and cascade the
designs to offshore development team
● Design, Deploy, manage, and operate scalable, highly available, and fault tolerant ETL /
BI / Bigdata / Analytics systems on GCP
● Implement and control the flow of data to and from GCP
● Select the appropriate GCP service based on compute, data and security requirements
● Identify appropriate use of GCP operational best practices
● Participate in Daily Scrum calls, Scrum Planning, Retro and Demos meetings
● Comply with development processes, documentation templates and tools prescribed by
CLOUDSUFI and its clients
● Contribute towards the creation of a knowledge repository, reusable assets/solution
accelerators and IPs
● Provide feedback to junior team members and be a coach and mentor for them
● Provide training sessions on the latest technologies and topics to other employees in the
organization
● Participate in organization development activities from time to time - Interviews,
CSR/Employee engagement activities, participation in business events/conferences,
implementation of new policies, systems and procedures as decided by Management
team
Classplus is India's largest B2B ed-tech start-up, enabling 1 Lac+ educators and content creators to create their digital identity with their own branded apps. Starting in 2018, we have grown more than 10x in the last year, into India's fastest-growing video learning platform.
Over the years, marquee investors like Tiger Global, Surge, GSV Ventures, Blume, Falcon, Capital, RTP Global, and Chimera Ventures have supported our vision. Thanks to our awesome and dedicated team, we achieved a major milestone in March this year when we secured a “Series-D” funding.
Now as we go global, we are super excited to have new folks on board who can take the rocketship higher🚀. Do you think you have what it takes to help us achieve this? Find Out Below!
What will you do?
• Define the overall process, which includes building a team for DevOps activities and ensuring that infrastructure changes are reviewed from an architecture and security perspective
• Create standardized tooling and templates for development teams to create CI/CD pipelines
• Ensure infrastructure is created and maintained using Terraform
• Work with various stakeholders to design and implement infrastructure changes to support new feature sets in various product lines.
• Maintain transparency and clear visibility of costs associated with various product verticals, environments and work with stakeholders to plan for optimization and implementation
• Spearhead continuous experimenting and innovating initiatives to optimize the infrastructure in terms of uptime, availability, latency and costs
You should apply, if you
1. Are a seasoned Veteran: Have managed infrastructure at scale running web apps, microservices, and data pipelines using tools and languages like JavaScript(NodeJS), Go, Python, Java, Erlang, Elixir, C++ or Ruby (experience in any one of them is enough)
2. Are a Mr. Perfectionist: You have a strong bias for automation and taking the time to think about the right way to solve a problem versus quick fixes or band-aids.
3. Bring your A-Game: Have hands-on experience and ability to design/implement infrastructure with GCP services like Compute, Database, Storage, Load Balancers, API Gateway, Service Mesh, Firewalls, Message Brokers, Monitoring, Logging and experience in setting up backups, patching and DR planning
4. Are up with the times: Have expertise in one or more cloud platforms (Amazon Web Services, Google Cloud Platform, or Microsoft Azure), and have experience in creating and managing infrastructure entirely through a tool like Terraform
5. Have it all on your fingertips: Have experience building CI/CD pipelines using Jenkins and Docker for applications running mainly on Kubernetes. Hands-on experience in managing and troubleshooting applications running on K8s
6. Have nailed the data storage game: Good knowledge of Relational and NoSQL databases (MySQL, Mongo, BigQuery, Cassandra…)
7. Bring that extra zing: Have the ability to program/script and strong fundamentals in Linux and networking.
8. Know your toys: Have a good understanding of Microservices architecture, Big Data technologies and experience with highly available distributed systems, scaling data store technologies, and creating multi-tenant and self hosted environments, that’s a plus
Being Part of the Clan
At Classplus, you’re not an “employee” but a part of our “Clan”. So, you can forget about being bound by the clock as long as you’re crushing it workwise😎. Add to that some passionate people working with and around you, and what you get is the perfect work vibe you’ve been looking for!
It doesn’t matter how long your journey has been or your position in the hierarchy (we don’t do Sirs and Ma’ams); you’ll be heard, appreciated, and rewarded. One can say, we have a special place in our hearts for the Doers! ✊🏼❤️
Are you a go-getter with the chops to nail what you do? Then this is the place for you.
- Lead multiple client projects in the organization.
- Define & build technical architecture for projects.
- Introduce the right coding standards within the team and ensure they are maintained across all projects.
- Ensure the quality delivery of projects.
- Ensure applications conform to security guidelines wherever required.
- Assist pre-sales/sales team for converting raw requirements from potential clients to functional solutions.
- Train fellow team members to apply the best practices available.
- Work on improving and managing processes within the team.
- Implement innovative ideas throughout the team to improve the overall efficiency and quality of the team.
- Ensure proper communication & collaboration within the team
Requirements
7+ Years of experience in developing large scale applications.
Solid domain knowledge and experience with various Design Patterns & Data Modelling (associations, OOP concepts etc.)
Should have exposure to multiple backend technologies, and databases - both relational and NoSQL
Should be aware of latest conventions for APIs
Preferred hands-on experience with GraphQL as well as REST API
Must be well-aware of the latest technological advancements for relevant platforms.
Advanced concepts of databases - Views, Stored Procedures, Database Optimization - are good to have.
Should have Research Oriented Approach
Solid at Logical thinking and Problem solving
Solid understanding of Coding Standards, Code Review processes and delivery of quality products
Experience with various Tools used in Development, Tests & Deployments.
Sound knowledge of DevOps and CI/CD Pipeline Tools
Solid experience with Git Workflow on Enterprise projects and larger teams
Should be good at documentation at project level and code level; should have good experience with Agile methodology and process
Should have a good understanding of server side deployment, scalability, maintainability and handling server security problems.
Should have a good understanding of software UX
Proficient with communication and good at making software architectural judgments
Expected outcomes
- Growing the team and retaining talent, thus, creating an inspiring environment for the team members.
- Creating more leadership within the team along with mentoring and guiding new joiners and experienced developers.
- Creating growth plans for the team and preparing training guides for other team members.
- Refining processes in the team on a regular basis to ensure quality delivery of projects- such as coding standards, project collaboration, code review processes etc.
- Improving overall efficiency and team productivity by introducing new methodologies and ideas in the team.
- Working on R&D and employing innovative technologies in the company.
- Streamlining processes which will result in saving time and cost optimization
- Ensuring code review healthiness and shipping superior quality code
Benefits
- Unlimited learning and growth opportunities
- A collaborative and cheerful work environment
- Exceptional reward and recognition policy
- Outstanding compensation
- Flexible work hours
- Opportunity to make an impact as your work will directly contribute to our business strategy.
At Nickelfox, you have a chance to craft a career path as unique as you are and become the best version of YOU. You will be part of a team with a ‘no limits’ mindset in an inclusive, people-focused culture. And we’re counting on your unique perspective to help Nickelfox grow even faster.
Are you passionate about tech? Dedicated to learning? Come, join us to build an extraordinary experience for yourself and a dignified working world for all.
What makes Nickelfox a great place for you?
At Nickelfox, you’ll join a team whose passion for technology and understanding of business has driven the company to serve clients across 25+ countries in just five years. We partner with our customers to fuel their growth story and enable them to make the right decisions with our customized technology services and insights. All in all, we are passionate about seeing our customers win the day. This is the reason why 80% of our business comes from repeat clients.
Our mission is to provide dignified employment and an environment that recognizes the uniqueness of every individual and values their expertise, and contribution. We have a culture that encourages everyone to bring their authentic selves to work. Our people enjoy a collaborative work environment with exceptional training and career development. If you like working with a curious, youthful, high-performing team, Nickelfox is the place for you.
About the company:
CogniTensor is an analytical software company that brings data to the heart of decision-making. CogniTensor leverages its product, DeepOptics - an integrated platform to implement 3A (Automation, Analytics and AI) at scale.
CogniTensor has customers across Finance, Energy, Commodity, Retail & Manufacturing. More details can be found on our website: https://www.cognitensor.com/
Our strategic investors include Shell and CIIE.CO (IIM-A/Accenture).
Qualification & Experience:
- BE/B.Tech Degree in Computer Programming, Computer Science, or a related field.
- 2+ years of experience as a Software Developer.
- Hands-on experience in developing finance applications is a must
Roles & Responsibilities:
We are looking for a Full Stack Developer to produce scalable software solutions. You’ll be part of a cross-functional team that’s responsible for the full software development life cycle, from conception to deployment.
As a Full Stack Developer, you should be comfortable with both front-end and back-end coding languages, development frameworks and third-party libraries. You should also be a team player with a knack for visual design and utility, familiar with Agile methodologies, and have good testing skills.
- Work with development teams and product managers to ideate software solutions
- Design client-side and server-side architecture
- Develop and manage well-functioning databases and applications
- Write effective APIs
- Write technical documentation
- Excellent communication and teamwork skills
Technical Skills:
Must Have
- React JS
- Git / Bitbucket
- Express JS, Python, HTML, CSS, Node JS
- CI/CD like CircleCI
- Postgres or any DB knowledge
- Familiarity with AWS or Azure or both
Good to Have
- Docker
- Redux
- Android development
- React Native
- Electron
- GraphQL
- Jira
What’s in for you:
- An opportunity to lead a business segment
- Extensive liaising with customers and partners
- A rewarding career progression
Preferred Location:
Delhi NCR
Job Description:
- Creating proxy-based applications
- Migrating legacy applications to new technologies
- Maintaining applications/websites and troubleshooting
Skills & Experience:
- Well-versed with SSO, Shibboleth and SAML
- Should have decent knowledge of networking
- Good with .NET web applications, MVC, REST APIs, React.js, Node.js, HTML, CSS
- Should have previously worked on applications based on SAML, SSO and CDNs.
- Should know Identity Providers, access gateways and integration with SSOs
Responsibilities:
● Writing back-end code and building efficient PHP modules.
● Developing back-end portals with an optimized database.
● Troubleshooting application and code issues.
● Integrating data storage solutions.
● Finalizing back-end features and testing web applications.
Skills Required:
● Solid understanding of Object-Oriented Programming in PHP.
● Hands-on experience in developing REST APIs.
● Worked with Laravel Queues.
● Knowledge of front-end technologies including Vue.js, CSS3, JavaScript
● Familiarity with MySQL databases.
● Able to multi-task on the project.
● Working knowledge of Agile Development Methodology
● Self-reliant and accept the responsibility of assigned work.
● Experience with some of the following is a huge bonus: AWS, Java, Python, GitHub.
- Develop backend processes in Django, Django REST Framework that communicate through established REST APIs.
- Build web components and containers that interact with REST APIs.
- Write documentation as new features and functionality are built.
- Write test cases for what you have written.
- Proactively identify bottlenecks and areas for improvement and devise a course-correct plan.
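The pairing of "build the endpoint" and "write test cases for what you have written" can be sketched framework-free in plain Python; the handler name, request shape, and status codes here are invented stand-ins, not DRF code:

```python
import unittest

# Hypothetical request handler in the spirit of a REST view: it takes a
# parsed request dict and returns (status_code, body).

def create_user(request):
    name = request.get("name")
    if not name:
        return 400, {"error": "name is required"}
    return 201, {"id": 1, "name": name}

# Test cases written alongside the handler, as the responsibilities ask.
class CreateUserTests(unittest.TestCase):
    def test_valid_request(self):
        status, body = create_user({"name": "asha"})
        self.assertEqual(status, 201)
        self.assertEqual(body["name"], "asha")

    def test_missing_name(self):
        status, _ = create_user({})
        self.assertEqual(status, 400)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(CreateUserTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

With Django REST Framework the same idea maps onto a view plus `APITestCase`, but the discipline of shipping each feature with its tests is identical.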
Mandatory Skills :
- Bachelor or Master of Technology in Computer Science, IT, or equivalent course.
- Strong understanding of all key areas of web application architecture.
- Strong knowledge of Data Structures, algorithms, and Design Patterns.
- Proficiency working in Django, Django REST Framework, or Flask.
- Proficient in a modern open-source relational database such as MySQL or Postgres
- Working knowledge of a NoSQL database (MongoDB preferred).
- Hands-on experience with Cloud services such as AWS (EC2, S3, RDS, SQS)
- Hands-on experience with automated unit testing.
- Ability to do database design and modeling.
- Ability to write modular and clean code.
Good to have skills :
- Experience working with Elastic Search or Solr Search.
- Working knowledge of MVC front-end frameworks such as AngularJS or jQuery.
- Working knowledge of LESS / SASS.
- Knowledge of Responsive Web Design (RWD)
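The "database design and modeling" requirement above can be illustrated with the stdlib `sqlite3` module; the two-table users/orders schema (and storing money in integer paise to avoid float issues) is a made-up example:

```python
import sqlite3

# Hypothetical relational model: users with a one-to-many orders table
# linked via a foreign key. Monetary totals are stored in integer paise.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        user_id     INTEGER NOT NULL REFERENCES users(id),
        total_paise INTEGER NOT NULL
    );
""")
conn.execute("INSERT INTO users (id, name) VALUES (1, 'asha')")
conn.execute("INSERT INTO orders (user_id, total_paise) VALUES (1, 1999), (1, 500)")

# A join + aggregate, the kind of query a normalized model makes cheap.
row = conn.execute("""
    SELECT u.name, COUNT(o.id), SUM(o.total_paise)
    FROM users u JOIN orders o ON o.user_id = u.id
    GROUP BY u.id
""").fetchone()
print(row)  # ('asha', 2, 2499)
```

The same shape carries over to MySQL or Postgres; only the connector and DDL dialect change.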
• Writing and testing code, debugging programs and integrating applications with third-party
web services. To be successful in this role, you should have experience using server-side logic
and work well in a team.
• Ultimately, you'll build highly responsive web applications that align with our business needs.
• Write effective, scalable code
• Develop back-end components to improve responsiveness and overall performance
• Integrate user-facing elements into applications
• Test and debug programs
• Improve functionality of existing systems
• Implement security and data protection solutions
• Assess and prioritize feature requests
• Coordinate with internal teams to understand user requirements and provide technical
solutions
• Expertise in at least one popular Python framework (like Django, Flask, etc)
• Team spirit
• Good problem-solving skills.
Requirements
• 3 to 5 years of experience as a Python Developer
• Hands-on experience with Flask, Django or Gin or Revel or Sanic
• Knowledge of design/architectural patterns will be considered as a plus
• Experience working in an agile development environment with a strong focus on rapid
software development
• Experience in AWS or similar cloud technologies
• Excellent troubleshooting and debugging skills
• Proven ability to complete the assigned task according to the outlined scope and timeline
• Experience with messaging frameworks such as SQS/Kafka/RabbitMQ
• Experience with Elastic Search
• Willingness to learn new and different technologies
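The messaging-framework requirement above follows a producer/consumer pattern that can be sketched with an in-memory `queue.Queue` standing in for a broker like SQS, Kafka, or RabbitMQ; the message names and sentinel-based shutdown are illustrative choices:

```python
import queue
import threading

# In-memory stand-in for a message broker: a producer publishes
# messages, a consumer thread pulls and processes them.
broker = queue.Queue()
processed = []

def consumer():
    while True:
        msg = broker.get()
        if msg is None:          # sentinel value: shut the consumer down
            break
        processed.append(msg.upper())
        broker.task_done()

t = threading.Thread(target=consumer)
t.start()
for msg in ["order-created", "order-paid"]:
    broker.put(msg)              # "publish" to the broker
broker.put(None)
t.join()
print(processed)  # ['ORDER-CREATED', 'ORDER-PAID']
```

Real brokers add durability, acknowledgements, and fan-out, but the decoupling of producer and consumer is the same idea.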
Location - South Delhi
Responsibilities:
- Participate in the entire application lifecycle, focusing on coding and debugging.
- Write clean code to develop functional web applications.
- Troubleshoot and debug applications.
- Perform UI tests to optimize performance.
- Manage cutting-edge technologies to improve legacy applications.
- Collaborate with Front-end developers to integrate user-facing elements with server-side logic.
- Provide training and support to internal teams.
Requirements:
- Experience: 3–6 years.
- Education Qualification: BE/MCA/B.Tech/M.Tech/BCA.
- Skills: Python, Django, Flask, SQL, AWS, Azure, Git.
- Having 3 years of experience in developing backend distributed applications in Python.
- Solid hands-on experience as Python Developer in advanced Python programming.
- Strong knowledge of building RESTful APIs using Python frameworks (e.g. Django, Flask, Web2py, Tornado).
- Good knowledge of developing and accessing web services (REST).
- Should have worked on cloud platforms/services offered by AWS, Azure or GCP (Google Cloud Platform).
- Good understanding of Data Structures and OOP concepts.
- Understanding of databases and SQL.
- Expertise in developing production-ready microservices in Python.
- Understanding of the threading limitations of Python, and multi-process architecture.
- Proficient understanding of code versioning tools, such as Git.
- Effective oral & written communication skills.
Job Title -Data Scientist
Job Duties
- Data Scientist responsibilities include planning projects and building analytics models.
- You should have a strong problem-solving ability and a knack for statistical analysis.
- If you're also able to align our data products with our business goals, we'd like to meet you. Your ultimate goal will be to help improve our products and business decisions by making the most out of our data.
Responsibilities
Own end-to-end business problems and metrics, build and implement ML solutions using cutting-edge technology.
Create scalable solutions to business problems using statistical techniques, machine learning, and NLP.
Design, experiment and evaluate highly innovative models for predictive learning
Work closely with software engineering teams to drive real-time model experiments, implementations, and new feature creations
Establish scalable, efficient, and automated processes for large-scale data analysis, model development, deployment, experimentation, and evaluation.
Research and implement novel machine learning and statistical approaches.
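Evaluating models, as the responsibilities above require, starts with agreed metrics. A minimal sketch of precision and recall from binary labels follows; the toy labels are invented for illustration:

```python
# Minimal model-evaluation sketch: precision and recall computed from
# true labels and predictions, as when comparing model experiments.

def precision_recall(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

y_true = [1, 0, 1, 1, 0]
y_pred = [1, 1, 1, 0, 0]
print(precision_recall(y_true, y_pred))
```

In practice a library such as scikit-learn provides these metrics, but knowing the arithmetic is what lets you pick the right one for an imbalanced business problem.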
Requirements
2-5 years of experience in data science.
In-depth understanding of modern machine learning techniques and their mathematical underpinnings.
Demonstrated ability to build PoCs for complex, ambiguous problems and scale them up.
Strong programming skills (Python, Java)
High proficiency in at least one of the following broad areas: machine learning, statistical modelling/inference, information retrieval, data mining, NLP
Experience with SQL and NoSQL databases
Strong organizational and leadership skills
Excellent communication skills
- Mandatory - Hands-on experience in Python and PySpark.
- Build PySpark applications using Spark DataFrames in Python using Jupyter Notebook and PyCharm (IDE).
- Worked on optimizing Spark jobs that process huge volumes of data.
- Hands on experience in version control tools like Git.
- Worked on Amazon’s analytics services like Amazon EMR, Lambda functions, etc.
- Worked on Amazon’s compute services like AWS Lambda and Amazon EC2, Amazon’s storage service S3, and a few other services like SNS.
- Experience/knowledge of bash/shell scripting will be a plus.
- Experience in working with fixed-width, delimited, multi-record file formats, etc.
- Hands on experience in tools like Jenkins to build, test and deploy the applications
- Awareness of DevOps concepts and ability to work in an automated release pipeline environment.
- Excellent debugging skills.
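Fixed-width files, mentioned in the requirements above, are parsed by slicing each line at known column offsets. A small sketch follows; the layout (field name, start offset, width) is a made-up example record:

```python
# Fixed-width record parsing sketch. Each tuple is
# (field_name, start_offset, width) for a hypothetical employee record.
LAYOUT = [("emp_id", 0, 5), ("name", 5, 10), ("dept", 15, 4)]

def parse_fixed_width(line, layout=LAYOUT):
    """Slice one fixed-width line into a dict of stripped field values."""
    return {name: line[start:start + width].strip()
            for name, start, width in layout}

record = "00042Asha      ENG "
print(parse_fixed_width(record))
# {'emp_id': '00042', 'name': 'Asha', 'dept': 'ENG'}
```

In PySpark the same offsets would typically be applied with `substring` column expressions over a text DataFrame, but the layout bookkeeping is identical.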
evaluations, clinics and care modules to help Indian parents and children. Our approach
leverages first-of-its-kind schools-parents-monitoring technology to evaluate, monitor and
guide children’s health and wellbeing. Based in Delhi NCR and Bengaluru, SKIDS is founded by
serial entrepreneurs with expertise in medical science, machine learning and education
technology. SKIDS is backed by marquee investors in Asia and India. We are actively
expanding our team and looking for passionate and motivated professionals who are keen to
join us in our next phase of growth.
Data Engineer
Location: Gurgaon, India
The Opportunity
We are looking for a data engineer who enjoys solving challenging problems. We are excited
about applicants who are creative, meticulous, and looking to learn broadly in a startup. The
ideal candidate is dedicated to excellence in the workplace, enjoys collaborating with others,
and thrives in a dynamic, fast-paced environment.
You would be expected to:
• Design and implement data pipelines using emerging technologies and tools
• Implement data and compute solutions in cloud platforms such as AWS
• Engineer data storage solutions for large and noisy datasets
• Work with the team to find optimal, scalable solutions for the business
• Collaborate efficiently with other data engineers, scientists, and technicians
You would have:
• a degree in computer science or related technical field
• 2-4 years of experience in a similar role
• proficiency in relevant languages and tech (Python, Linux, Docker etc)
• good working knowledge of AWS based cloud architecture
• familiarity with storage design and best practices
• familiarity with standard security protocols and practices
Our company encourages diversity and is an equal opportunity employer
Skills and requirements
- Experience analyzing complex and varied data in a commercial or academic setting.
- Desire to solve new and complex problems every day.
- Excellent ability to communicate scientific results to both technical and non-technical team members.
Desirable
- A degree in a numerically focused discipline such as Maths, Physics, Chemistry, Engineering or Biological Sciences.
- Hands on experience on Python, Pyspark, SQL
- Hands on experience on building End to End Data Pipelines.
- Hands on experience on Azure Data Factory, Azure Databricks, Data Lake - an added advantage
- Experience with Bigdata Tools, Hadoop, Hive, Sqoop, Spark, SparkSQL
- Experience with SQL or NoSQL databases for the purposes of data retrieval and management.
- Experience in data warehousing and business intelligence tools, techniques and technology, as well as experience in diving deep on data analysis or technical issues to come up with effective solutions.
- BS degree in math, statistics, computer science or equivalent technical field.
- Experience in data mining structured and unstructured data (SQL, ETL, data warehouse, Machine Learning etc.) in a business environment with large-scale, complex data sets.
- Proven ability to look at solutions in unconventional ways. Sees opportunities to innovate and can lead the way.
- Willing to learn and work on Data Science, ML, AI.
Data Engineer
Required skill set: AWS GLUE, AWS LAMBDA, AWS SNS/SQS, AWS ATHENA, SPARK, SNOWFLAKE, PYTHON
Mandatory Requirements
- Experience with AWS Glue
- Experience with Apache Parquet
- Proficiency with AWS S3 and data lakes
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices
- Scripting languages: Python and PySpark
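One common file-based ingestion practice is to validate each incoming file against a manifest before loading it. A minimal Python sketch of that idea, using only the standard library (the manifest format and function names are illustrative assumptions, not part of this posting):

```python
# Hedged sketch of a file-based ingestion check: verify an incoming file
# is non-empty and matches its expected checksum before loading it.
# The manifest maps file name -> expected SHA-256 hex digest (assumed format).
import hashlib
import os

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def validate_incoming(path: str, manifest: dict) -> bool:
    """True only if the file is non-empty and its checksum matches the manifest."""
    if os.path.getsize(path) == 0:
        return False
    return sha256_of(path) == manifest.get(os.path.basename(path))
```

In practice a gate like this would run before any Glue or Snowflake load, quarantining files that fail either check.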
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Data ingestion from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems. Implement data ingestion and processing with the help of big data technologies
- Data processing/transformation using technologies such as Spark and cloud services. You will need to understand your part of the business logic and implement it using the languages supported by the base data platform
- Develop automated data quality checks to ensure the right data enters the platform and to verify the results of calculations
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports and presenting findings
- Mentor junior members and bring best industry practices
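The automated data quality checks described in the responsibilities above can be sketched in plain Python; the field names and rules below are illustrative assumptions, not requirements from the posting:

```python
# Minimal data-quality gate: reject records with missing keys or
# out-of-range values before they enter the platform.
# "customer_id" and the amount bounds are hypothetical rules.
def check_row(row: dict) -> list:
    """Return the list of rule violations for one ingested record."""
    errors = []
    if not row.get("customer_id"):
        errors.append("missing customer_id")
    amount = row.get("amount")
    if amount is None or not (0 <= amount <= 1_000_000):
        errors.append("amount out of range")
    return errors

def run_quality_checks(rows):
    """Split a batch into clean rows and (row, errors) rejects."""
    clean, rejects = [], []
    for row in rows:
        errors = check_row(row)
        if errors:
            rejects.append((row, errors))
        else:
            clean.append(row)
    return clean, rejects
```

In a real pipeline the same rule functions would run inside the Spark or Glue job, with rejects routed to a quarantine location for review.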
QUALIFICATIONS
- 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of one of the following languages: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with:
- Data mining/programming tools (e.g. SAS, SQL, R, Python)
- Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
- Data visualization (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience in the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
Job Title
Data Analyst
Job Brief
The successful candidate will turn data into information, information into insight and insight into business decisions.
Data Analyst Job Duties
Data analyst responsibilities include conducting full lifecycle analysis to include requirements, activities and design. Data analysts will develop analysis and reporting capabilities. They will also monitor performance and quality control plans to identify improvements.
Responsibilities
● Interpret data, analyze results using statistical techniques and provide ongoing reports.
● Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality.
● Acquire data from primary or secondary data sources and maintain databases/data systems.
● Identify, analyze, and interpret trends or patterns in complex data sets.
● Filter and “clean” data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems.
● Work with management to prioritize business and information needs.
● Locate and define new process improvement opportunities.
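The filter-and-clean duty above can be illustrated with a small Python sketch that drops numeric outliers using a z-score rule; the threshold and the rule itself are illustrative assumptions, one of many possible cleaning strategies:

```python
# Illustrative cleaning step: keep only values within `z` standard
# deviations of the mean. Uses only the standard library.
from statistics import mean, stdev

def drop_outliers(values, z=3.0):
    """Return the values whose z-score magnitude is at most `z`."""
    if len(values) < 2:
        return list(values)          # too few points to estimate spread
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return list(values)          # all values identical; nothing to drop
    return [v for v in values if abs(v - mu) / sigma <= z]
```

For example, `drop_outliers([10, 11, 9, 10, 12, 500], z=2.0)` keeps the first five values and drops 500. In practice an analyst would pick the threshold from the data's distribution rather than a fixed constant.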
Requirements
● Proven working experience as a Data Analyst or Business Data Analyst.
● Technical expertise regarding data models, database design and development, data mining and segmentation techniques.
● Strong knowledge of and experience with reporting packages (Business Objects etc.), databases (SQL etc.), programming (XML, JavaScript, or ETL frameworks).
● Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, SPSS, SAS etc.).
● Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
● Adept at queries, report writing and presenting findings.
Job Location: South Delhi, New Delhi
About Us
We began in 2015 with an entrepreneurial vision to bring digital change to the manufacturing landscape of India. With a team of 1,500, we are working towards the digital transformation of businesses in the manufacturing industry across domains like footwear, apparel, textiles and accessories. We are backed by investors such as Info Edge (Naukri.com), Matrix Partners, Sequoia, WaterBridge Ventures and select industry leaders.
Today, we have enabled 4000+ Manufacturers to digitize their distribution channel.
Roles & Responsibilities
Writing and testing code, debugging programs and integrating applications with third-party web services. To be successful in this role, you should have experience using server-side logic and work well in a team. Ultimately, you'll build highly responsive web applications that align with our business needs.
- Write effective, scalable code
- Develop back-end components to improve responsiveness and overall performance
- Integrate user-facing elements into applications
- Test and debug programs
- Improve functionality of existing systems
- Implement security and data protection solutions
- Assess and prioritize feature requests
- Coordinate with internal teams to understand user requirements and provide technical solutions
- Expertise in at least one popular Python framework (like Django, Flask, etc.)
- Team spirit
- Good problem-solving skills
Requirements
1 to 5 years of experience as a Python Developer
Hands-on experience with Flask, Django, Gin, Revel, or Sanic
Knowledge of design/architectural patterns will be considered as a plus
Experience working in an agile development environment with a strong focus on rapid software development
Experience in AWS or similar cloud technologies
Excellent troubleshooting and debugging skills
Proven ability to complete assigned tasks according to the outlined scope and timeline
Experience with messaging frameworks such as SQS/Kafka/RabbitMQ
Experience with Elasticsearch
Willingness to learn new and different technologies
• 2 - 5 years of experience building React and/or Mobile Applications
• 5-8 years working with microservices, API servers, databases, cloud-native development, observability, alerting, and monitoring
• Deep exposure to cloud services, preferably Azure
• Preferably worked in the Finance/Retail domain or other similar domains with complex business requirements.
• Hands-on skills combined with leadership qualities to guide teams.
Location – Bangalore, Mumbai, Gurgaon
Functional / Technical Skills:
• Strong understanding of networking fundamentals
o OSI Stack, DNS, TCP protocols
o Browser rendering and various stages of execution
• Good understanding of RESTful APIs, GraphQL and Web Sockets
• Ability to debug and profile Web/Mobile applications with Chrome DevTools or Native profilers
• Strong understanding of Distributed Systems, Fault Tolerance and Resiliency. Exposure to setting up and managing chaos engineering experiments is a plus.
• Exposure to Domain Driven Design (DDD), SOLID principles, and Data Modelling on various RDBMS,
NoSQL databases.
• Ability to define and document performance goals, SLAs, and volumetrics; create a framework for measuring and validating the goals; and work with teams to implement and meet them.
• Create automation scripts to measure performance and make this part of the CI/CD process.
• Good understanding of CNCF projects with a specific focus on Observability, Monitoring, Tracing, Sidecars, Kubernetes
• Tuning of Cloud-native deployments with a focus on Cost Optimization.
• Participate in architecture reviews to identify potential issues, and bottlenecks and provide early guidance.
• Deep knowledge of at least 2 different programming languages and runtimes: any two of Ruby, Python, Swift, Go, Rust, C#, Dart, Kotlin, Java, Haskell, OCaml
• Excellent verbal and written communication
• A mindset to constantly learn new things and challenge the Status Quo.
• Experience in the financial domain preferred.
• Hands-on developer in Core Java with an excellent understanding of computer science fundamentals, data structures, algorithms and design patterns.
• Experience with frameworks like Spring, RESTful web services, queuing systems, Angular, and Python is highly desired.
• Deep understanding of several cloud providers such as AWS, Azure, Google etc.
• Hands-on experience developing CI/CD pipelines for continuous development and integration.
• Deep knowledge and experience of Java/J2EE and servers like Tomcat.
- Strong CS fundamentals in OOD, DS, Algorithms and Problem Solving for a wide variety of problem spaces and technologies
- Expert coder in any modern language such as Java, Golang or Scala.
- Experience in working with Product and Engineering leaders to drive and implement platform and product vision.
- Ability to leverage deep and wide knowledge of technology stack to recommend appropriate architecture and design solutions, and provide technical leadership to a team of rock star Software Engineers.
- Can translate the impact of design choices on non-functional attributes like scalability, performance, availability and security.
- Experience in providing leadership, career guidance, performance management, prioritization and personnel management for a team of at least 10.
- Experience creating large-scale, multi-tiered, distributed web applications with databases, and designing web services, APIs, data models and schemas, using SQL or NoSQL.
- Experience with Cloud environments, such as AWS.
- Comfortable in Windows and Linux environments.
- Comfortable with different data storage solutions such as Postgres, Oracle, SQL Server, ElasticSearch, SQL, Hadoop, or MongoDB.
- Experience delivering high-quality software following continuous delivery and using code quality tools (JIRA, GitHub, Jenkins, Hudson, Sonar, PMD, Checkstyle, FindBugs, Fortify, etc.)
- Worked in Agile environments previously.
- Good command of development metrics, methodologies and tools.
- Will ensure timely and frequent delivery of high quality software, while adapting technical decisions to the immediate needs of the business.
- Good analytical, problem-solving and troubleshooting skills.
- Can drive adoption of best practices across the Software Engineering Lifecycle, including reviews, source control, build processes, continuous integration and deployment.
- Work collaboratively with product managers to translate requirements into reusable, versioned and super-fast APIs.
- Partner and work closely with frontend developers for seamless API integration.
- Build a product loved by users by evaluating through the eyes of the customer
- Apply best-practices and coding patterns to ensure solutions are maintainable, modular, of high code-quality and work well across multiple versions.
General requirements:
- Open to learning new concepts and technologies - this is our single biggest and most important requirement. Everything else is optional.
- Good knowledge of version control (preferably Git).
- Ability to choose the right algorithms and data structures.
- Basics of a cloud-based platform (GCP or AWS)
- Basics of service-based distributed design.
Our Tech Stack :
We are open to changing this mix as we evolve:
- Ruby and Rails
- PostgreSQL
- ElasticSearch
- Redis + Sidekiq
- Google Cloud Platform
Good to have :
- Experience with Javascript, React and CSS
We are looking for a Snowflake developer for one of our premium clients for their PAN India location.
Delivery Manager
Job Description
- Review customer orders, plan and coordinate the execution of projects, and manage client accounts
- Develop scope and budget for projects
- Ability to understand all technical aspects of the project and its requirements, articulate and communicate the same to internal stakeholders
- Work with the Presales team to define the technical specification for features and functionalities and also determine the effort associated
- Hands-on experience in creating SDD, SRS, Gantt Charts, etc.
- Work closely with Engineering, Solutioning and Platform teams during the requirement gathering and documentation phase to understand and establish the scope of development work in projects
- Provide suggestions on implementation approach, limitations/complexity around implementation with respect to the platform used, and recommendations for alternative solutions
- Perform resource allocations and workload assignments according to project requirements.
- Report project status to customers and develop required project documentation.
- Serve as the primary contact for all projects being handled and concerns in assigned accounts
Must have skills:
- 8+ years of experience leading and delivering projects to high standards and managing high-value accounts
- Basic understanding of application development technologies like Python, ML, API Integration, etc.
- Good understanding of server/storage configuration, API Integration, Cloud deployment, and configuration
- Should have experience working with large government clients and/or large enterprises in BFSI, eCommerce, Healthcare, Retail, and other such verticals
- Proven track record of building positive and productive working relationships with customers for business growth
- Ability to analyze and troubleshoot issues in a timely fashion
- Ability to identify process improvements to achieve cost-effectiveness and time-saving
- Proven ability to operate with authority and take critical business decisions to meet customer expectations.
- Should have exceptional communication skills (verbal and written) in English
Essential Personal Attributes:
- Must be a strong relationship builder with experience in managing all stakeholders
- Interest in emerging technologies and how they can be applied to drive business outcomes
- Demonstrated commercial and business focus
- Negotiation and influencing skills utilizing a consultative approach
- Ability to multitask and prioritize work to meet timeframes
- Ability to take ownership of tasks as allocated and raise issues or request resources as appropriate
- Ability to communicate technical information to non-technical colleagues and clients.
- Excellent stakeholder management and reporting skills
- Must be able to translate technical environments into business language
- Strong commercial acumen
- Be proficient in server-side development and optimization of data, including database creation and management and debugging
- Integrate data from various back-end services and databases
- Create and maintain software documentation
- Create user-friendly and intuitive interfaces
- Create and analyze reliable and secure back-end functionality
- Maintain, expand, and scale our website
- Remain knowledgeable of emerging technologies/industry trends and apply them to operations and activities
- Collaborate with front-end developers and web designers to match visual design intent
- Bachelor of Engineering/Technology in computer science, software engineering, programming, or equivalent
- Proficiency with languages such as Python, Golang, and Javascript (Node.js, Vue.js)
- Proficiency with MongoDB and MySQL
- Understanding of object-oriented programming
- Experience with the design and implementation of APIs
- Understanding of code versioning and management with Git
- Understanding of code deployment tools such as Jenkins, Capistrano, and ElectricFlow
- Track record of successfully managing multiple company or customer websites
- Excellent time-management and communication skills
We are looking for a Python Web Developer responsible for managing the interchange of data between the server and the users. Your primary focus will be the development of all server-side logic, ensuring high performance and responsiveness to requests from the front end. You will also be responsible for integrating the front-end elements into the application; therefore, understanding front-end technologies is necessary as well.
Education Qualification: B.E./B.Tech (CSE/IT/ECE), MCA, or B.Sc. (Computers) from a reputed university and/or equivalent knowledge and skills.
Skillset and Expertise.
1. 0-6 months of industry experience.
2. Familiarity with Python, the Django framework, and ORM (Object-Relational Mapper) concepts.
3. Understanding of the threading limitations of Python and multi-process architecture.
4. Understanding of the fundamental design principles behind a scalable application.
5. Knowledge of unit testing and debugging applications to ensure low latency and high availability.
6. Writing reusable, testable, and efficient code.
7. Writing effective and scalable Python code.
8. ReactJS or AngularJS (version 8 and above) skills are an added advantage.
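The threading limitation mentioned in point 3 can be illustrated with a small hedged sketch: CPython's GIL lets only one thread execute Python bytecode at a time, so CPU-bound work is typically fanned out to worker processes instead of threads. All function names here are illustrative:

```python
# CPython's GIL serialises bytecode execution across threads, so a
# multi-process architecture is the usual answer for CPU-bound work.
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit: int) -> int:
    """CPU-bound task: count primes below `limit` with a naive trial division."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def count_primes_parallel(limits):
    """Fan the work out to worker processes, sidestepping the GIL."""
    with ProcessPoolExecutor() as pool:
        return list(pool.map(count_primes, limits))

if __name__ == "__main__":
    print(count_primes_parallel([1_000, 2_000]))  # [168, 303]
```

Threads remain the right tool for I/O-bound work (network calls, disk waits), where the GIL is released while waiting.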
Skills
Python
Django
Django – ORM
HTML5
CSS
Bootstrap
JavaScript
1. Demonstrable problem-solving and analytical skills.
2. Ability to program (at beginner or advanced beginner level in one or more programming languages like Java, C#, Python, JavaScript, etc.).
3. Ability to think through a problem at conceptual level and propose solutions.
4. Excellent verbal and written communication skills.
5. Keen desire to learn, stretch and a never-give-up attitude.
6. Very strong interpersonal skills and an easy-going personality.
7. Highly proactive, result-oriented, and a team player.
8. Ability to stay current with new technology.
Roles and Responsibilities
1. Engineering graduates of 2022 looking to kick-start their careers in technology.
2. The ideal candidate should have a hands-on interest in programming and some experience writing software applications. If you have a GitHub profile, please provide a link to your work. An internship or college project is nice to have.
3. Knowledge of one or more programming languages, sufficient to write fully executable and functional code, is very desirable. Knowledge of databases is also desirable. Both may be at a beginner or advanced beginner level.
4. You should be focused, enthusiastic, motivated, and have a desire to progress by learning, executing and experimenting.
Who Are We
Orbo is a research-oriented company with expertise in computer vision and artificial intelligence. At its core, Orbo is a comprehensive platform built on an AI-based visual enhancement stack, so companies can find a product suited to their needs, where deep-learning-powered technology automatically improves their imagery.
ORBO's solutions support digital transformation in BFSI, beauty and personal care, and e-commerce image retouching in multiple ways.
WHY US
- Join top AI company
- Grow with your best companions
- Continuous pursuit of excellence, equality, respect
- Competitive compensation and benefits
You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.
To learn more about how we work, please check out
Description:
We are looking for a computer vision engineer to lead our team in developing a factory floor analytics SaaS product. This would be a fast-paced role and the person will get an opportunity to develop an industrial grade solution from concept to deployment.
Responsibilities:
- Research and develop computer vision solutions for industries (BFSI, Beauty and personal care, E-commerce, Defence etc.)
- Lead a team of ML engineers in developing an industrial AI product from scratch
- Set up an end-to-end deep learning pipeline for data ingestion, preparation, model training, validation and deployment
- Tune the models to achieve high accuracy rates and minimum latency
- Deploying developed computer vision models on edge devices after optimization to meet customer requirements
Requirements:
- Bachelor’s degree
- Understanding of the depth and breadth of computer vision and deep learning algorithms.
- Experience in taking an AI product from scratch to commercial deployment.
- Experience in Image enhancement, object detection, image segmentation, image classification algorithms
- Experience in deployment with OpenVINO, ONNXruntime and TensorRT
- Experience in deploying computer vision solutions on edge devices such as Intel Movidius and Nvidia Jetson
- Experience with machine/deep learning frameworks like TensorFlow and PyTorch.
- Proficient understanding of code versioning tools, such as Git
Our perfect candidate is someone that:
- is proactive and an independent problem solver
- is a constant learner. We are a fast growing start-up. We want you to grow with us!
- is a team player and good communicator
What We Offer:
- You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
- You will be in charge of what you build and be an integral part of the product development process
- Technical and financial growth!
Navgurukul is a collective of changemakers who are passionate about creating large-scale social impact, particularly improving the quality of our education system while making it accessible to marginalised communities.
We train students from low-income and marginalised communities in software engineering in our residential programme, enabling them to get aspirational jobs, have a voice, and be equipped to bring at least 10 families out of poverty in a financially sustainable model. So far, we have placed 150+ students in various companies including, but not limited to, Mindtree, Thoughtworks and Unacademy.
You can be a part of this movement through the CIF Fellowship.
To know more about CIF visit us at codeindiafellowship.org and to apply you can fill the form at bit.ly/cifapply.
Candidates should have 4+ years of experience in Python automation testing with the Robot Framework and Selenium. Candidates with less experience cannot be considered.
GCP Data Analyst profile must have below skills sets :
- Knowledge of programming languages like SQL, Oracle, R, MATLAB, Java and Python
- Data cleansing, data visualization, data wrangling
- Data modeling , data warehouse concepts
- Ability to adapt to big data platforms like Hadoop and Spark for stream and batch processing
- GCP (Cloud Dataproc, Cloud Dataflow, Cloud Datalab, Cloud Dataprep, BigQuery, Cloud Datastore, Cloud Datafusion, Auto ML etc)
A.P.T Portfolio is a high-frequency trading firm that specialises in quantitative trading and investment strategies. Founded in November 2009, it has been a major liquidity provider in global stock markets.
As a manager, you would be in charge of managing the DevOps team, and your remit shall include the following:
- Private Cloud - Design & maintain a high performance and reliable network architecture to support HPC applications
- Scheduling Tool - Implement and maintain an HPC scheduling technology like Kubernetes, Hadoop YARN, Mesos, HTCondor or Nomad for processing and scheduling analytical jobs. Implement controls which allow analytical jobs to seamlessly utilize idle capacity on the private cloud.
- Security - Implement best security practices and a data isolation policy between different divisions internally.
- Capacity Sizing - Monitor private cloud usage and share details with different teams. Plan capacity enhancements on a quarterly basis.
- Storage solution - Optimize storage solutions like NetApp, EMC, Quobyte for analytical jobs. Monitor their performance on a daily basis to identify issues early.
- NFS - Implement and optimize latest version of NFS for our use case.
- Public Cloud - Drive AWS/Google-Cloud utilization in the firm for increasing efficiency, improving collaboration and for reducing cost. Maintain the environment for our existing use cases. Further explore potential areas of using public cloud within the firm.
- Backups - Identify and automate the backup of all crucial data, binaries, code, etc. in a secure manner, at intervals warranted by the use case. Ensure that recovery from backup is tested and seamless.
- Access Control - Maintain passwordless access control and improve security over time. Minimize failures for automated jobs due to unsuccessful logins.
- Operating System - Plan, test and roll out new operating systems for all production, simulation and desktop environments. Work closely with developers to highlight the performance-enhancement capabilities of new versions.
- Configuration Management - Work closely with the DevOps/development team to freeze configurations/playbooks for various teams and internal applications. Deploy and maintain standard tools such as Ansible, Puppet, Chef, etc. for the same.
- Data Storage & Security Planning - Maintain tight control of root access on various devices. Ensure root access is rolled back as soon as the desired objective is achieved.
- Audit access logs on devices. Use third party tools to put in a monitoring mechanism for early detection of any suspicious activity.
- Maintaining all third-party tools used for development and collaboration - This shall include maintaining a fault-tolerant environment for Git/Perforce, productivity tools such as Slack/Microsoft Teams, and build tools like Jenkins/Bamboo.
Qualifications
- Bachelors or Masters Level Degree, preferably in CSE/IT
- 10+ years of relevant experience in sys-admin function
- Must have strong knowledge of IT infrastructure, Linux, networking and grid computing.
- Must have a strong grasp of automation and data management tools.
- Proficient in scripting languages, including Python
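As a hedged illustration of the kind of Python scripting this sys-admin role involves, here is a minimal disk-usage check using only the standard library; the paths and the 90% threshold are illustrative assumptions:

```python
# Small admin utility: report filesystem usage and flag mounts that
# exceed a usage threshold. Threshold and paths are hypothetical.
import shutil

def usage_percent(path: str) -> float:
    """Return the percentage of the filesystem at `path` that is in use."""
    total, used, _free = shutil.disk_usage(path)
    return 100.0 * used / total

def over_threshold(paths, limit=90.0):
    """Return the paths whose usage exceeds `limit` percent."""
    return [p for p in paths if usage_percent(p) > limit]
```

A cron job could run `over_threshold(["/", "/data"])` and alert when the list is non-empty; in a real deployment this would feed the monitoring stack rather than print.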
Desirables
- Professional attitude, co-operative and mature approach to work, must be focused, structured and well considered, troubleshooting skills.
- Exhibit a high level of individual initiative and ownership, effectively collaborate with other team members.
APT Portfolio is an equal opportunity employer
Who Are We
Orbo is a research-oriented company with expertise in computer vision and artificial intelligence. At its core, Orbo is a comprehensive platform built on an AI-based visual enhancement stack, so companies can find a product suited to their needs, where deep-learning-powered technology automatically improves their imagery.
ORBO's solutions support digital transformation in BFSI, beauty and personal care, and e-commerce image retouching in multiple ways.
WHY US
- Join top AI company
- Grow with your best companions
- Continuous pursuit of excellence, equality, respect
- Competitive compensation and benefits
You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.
To learn more about how we work, please check out
Description:
We are looking for a computer vision engineer to lead our team in developing a factory floor analytics SaaS product. This would be a fast-paced role and the person will get an opportunity to develop an industrial grade solution from concept to deployment.
Responsibilities:
- Research and develop computer vision solutions for industries (BFSI, Beauty and personal care, E-commerce, Defence etc.)
- Lead a team of ML engineers in developing an industrial AI product from scratch
- Set up an end-to-end deep learning pipeline for data ingestion, preparation, model training, validation and deployment
- Tune the models to achieve high accuracy rates and minimum latency
- Deploying developed computer vision models on edge devices after optimization to meet customer requirements
Requirements:
- Bachelor’s degree
- Understanding of the depth and breadth of computer vision and deep learning algorithms.
- 4+ years of industrial experience in computer vision and/or deep learning
- Experience in taking an AI product from scratch to commercial deployment.
- Experience in Image enhancement, object detection, image segmentation, image classification algorithms
- Experience in deployment with OpenVINO, ONNXruntime and TensorRT
- Experience in deploying computer vision solutions on edge devices such as Intel Movidius and Nvidia Jetson
- Experience with machine/deep learning frameworks like TensorFlow and PyTorch.
- Proficient understanding of code versioning tools, such as Git
Our perfect candidate is someone that:
- is proactive and an independent problem solver
- is a constant learner. We are a fast growing start-up. We want you to grow with us!
- is a team player and good communicator
What We Offer:
- You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
- You will be in charge of what you build and be an integral part of the product development process
- Technical and financial growth!