upGrad is an online education platform building the careers of tomorrow by offering the most industry-relevant programs in an immersive learning experience. Our mission is to create a new digital-first learning experience that delivers tangible career impact to individuals at scale. upGrad currently offers programs in Data Science, Machine Learning, Product Management, Digital Marketing, and Entrepreneurship, among others. upGrad is looking for people passionate about management and education who can help design learning programs that keep working professionals sharp and relevant, and help build the careers of tomorrow.
- upGrad was awarded Best Tech for Education by IAMAI for 2018-19
- upGrad was ranked one of the LinkedIn Top Startups 2018: the 25 most sought-after startups in India
- upGrad was earlier selected as one of the top ten most innovative companies in India
- We were also covered by the Financial Times along with other disruptors in Ed-Tech
- upGrad is the official education partner for the Government of India - Startup India
- Our program with IIIT-B has been ranked the #1 program in the country in the domain of Artificial Intelligence and Machine Learning
Are you excited by the challenge and the opportunity of applying data-science and data-analytics techniques to the fast-developing education technology domain? Do you look forward to the sense of ownership and achievement that comes with innovating, creating data products from scratch, and pushing them live into production systems? Do you want to work with a team of highly motivated members who are on a mission to empower individuals through education?
If this is you, come join us and become a part of the upGrad technology team. At upGrad, the technology team enables all facets of the business - from bringing efficiency to our marketing and sales initiatives, to enhancing our student learning experience, to empowering our content, delivery, and student success teams, to helping our students achieve their desired career outcomes. We play the part of bringing together data & tech to solve the business problems and opportunities at hand.
We are looking for a highly skilled, experienced, and passionate data scientist who can come on board and help create the next generation of data-powered education tech products. The ideal candidate has worked in a data science role before, is comfortable working with unknowns, can evaluate the data and the feasibility of applying scientific techniques to business problems and products, and has a track record of developing and deploying data-science models into live applications. We want someone with a strong math, stats, and data-science background, comfortable handling both structured and unstructured data, with strong engineering know-how to implement and support such data products in a production environment.
Ours is a highly iterative and fast-paced environment, so flexibility, clear communication, and attention to detail are very important too. The ideal candidate is passionate about customer impact and comfortable working with multiple stakeholders across the company.
What are we looking for?
- 3+ years of experience in analytics, data science, machine learning or comparable role
- Bachelor's degree in Computer Science, Data Science/Data Analytics, Math/Statistics or related discipline
- Experience in building and deploying Machine Learning models in Production systems
- Strong analytical skills: ability to make sense out of a variety of data and its relation/applicability to the business problem or opportunity at hand
- Strong programming skills: comfortable with Python - pandas, numpy, scipy, matplotlib; Databases - SQL and noSQL
- Strong communication skills: ability to both formulate/understand the business problem at hand as well as ability to discuss with non data-science background stakeholders
- Comfortable dealing with ambiguity and competing objectives
Preferred:
- Experience in Text Analytics, Natural Language Processing
- Advanced degree in Data Science/Data Analytics or Math/Statistics
- Comfortable with data-visualization tools and techniques
- Knowledge of AWS and Data Warehousing
- Passion for building data products for Production systems - a strong desire to impact the product through data-science techniques
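To make the required skills above concrete, here is a minimal, hedged sketch of the kind of pandas/numpy exploratory work the posting describes. All data and column names here are invented for illustration; in practice the data would come from the company's SQL/NoSQL stores.

```python
import numpy as np
import pandas as pd

# Toy stand-in for learner-engagement data pulled from a SQL store
# (learner_id, course, minutes_watched are hypothetical column names).
df = pd.DataFrame({
    "learner_id": [1, 1, 2, 2, 3],
    "course": ["ML", "ML", "DS", "DS", "ML"],
    "minutes_watched": [30, 45, 10, 60, 25],
})

# Aggregate engagement per course - the kind of slice a business
# stakeholder might ask a data scientist for.
summary = (
    df.groupby("course")["minutes_watched"]
      .agg(total="sum", mean="mean")
      .reset_index()
)
print(summary)
```

The same groupby-aggregate pattern scales from a quick notebook analysis to a feature-engineering step inside a production model pipeline.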
upGrad is an online higher education platform. Founded by Ronnie Screwvala, Mayank Kumar, Ravijot Chugh and Phalgun Kompalli in March 2015, upGrad provides rigorous industry-relevant programs designed and delivered in collaboration with world-class faculty and industry. Merging the latest technology, pedagogy, and services, upGrad is creating an immersive learning experience - anytime and anywhere.
Through exclusive partnerships with some of the most prominent universities - IIIT-Bangalore, MICA, BITS Pilani, ISB, and Cambridge Judge Business School - our aim is to impart university education online.
Learning online can be tough, especially when you have to do it all by yourself. Here's why you should upskill with upGrad:
- We provide an engaging experience via our suite of learning applications, right from your university application until you get a job and transition into it
- We provide structured online courses in collaboration with some of the prominent universities and industry experts
- We co-create a rigorous curriculum in collaboration with these universities to provide the learners with a holistic learning experience
- All our courses are comprehensive, structured and rigorous - delivered online, providing you the flexibility and opportunity of continuous learning
- We conduct regular live lectures with industry experts and professors
- Each of our learners is assigned a dedicated student mentor who helps them chart a career path and motivates them to push themselves
- We provide in-depth feedback on all the assignments, case studies, and projects
- We have delivered 400+ successful career transitions and we’re committed to building careers of tomorrow
- You get access to an alumni network of 3,000+ students across the globe
- We also conduct periodic offline events like Hackathons, Bootcamps, Alumni Nights and connect you not only to the professors and industry experts but the peers in your batch too
- Last but not least, we provide career assistance and help all learners with interview preparation, mentorship calls, and job placements even after the completion of the program
Data-driven decision-making is core to advertising technology at AdElement. We are looking for sharp, disciplined, and highly quantitative machine learning / artificial intelligence engineers with big-data experience and a passion for digital marketing to help drive informed decision-making. You will work with top talent and cutting-edge technology and have a unique opportunity to turn your insights into products influencing billions. The ideal candidate will have an extensive background in distributed training frameworks, experience deploying machine learning models end to end, and some experience in data-driven decision-making around machine learning infrastructure enhancement. This is your chance to leave your legacy and be part of a highly successful and growing company.
- 2+ years of industry experience with Python in a programming intensive role
- 1+ years of experience with one or more of the following machine learning topics: classification, clustering, optimization, recommendation system, graph mining, deep learning
- 2+ years of industry experience with distributed computing frameworks such as Hadoop/Spark, Kubernetes ecosystem, etc
- 2+ years of industry experience with popular deep learning frameworks such as Spark MLlib, Keras, Tensorflow, PyTorch, etc
- 2+ years of industry experience with major cloud computing services
- An effective communicator with the ability to explain technical concepts to a non-technical audience
- (Preferred) Prior experience with ads product development (e.g., DSP/ad-exchange/SSP)
- Collaborate across multiple teams - Data Science, Operations & Engineering on unique machine learning system challenges at scale
- Leverage distributed training systems to build scalable machine learning pipelines, including ETL, model training, and deployments, in the Real-Time Bidding space
- Design and implement solutions to optimize distributed training execution in terms of model hyperparameter optimization, model training/inference latency and system-level bottlenecks
- Research state-of-the-art machine learning infrastructure to improve data health, model quality, and state management during the lifecycle of ML model refreshes
- Optimize integration between popular machine learning libraries and cloud ML and data processing frameworks.
- Build Deep Learning models and algorithms with optimal parallelism and performance on CPUs/ GPUs.
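One responsibility above is model hyperparameter optimization. As a minimal illustration, here is a plain-Python grid search sketch; the objective function is an invented stand-in, whereas in practice it would be a cross-validated training run on real data.

```python
from itertools import product

def objective(lr, depth):
    # Invented toy objective: a stand-in "validation loss" that happens to be
    # minimized at lr=0.1, depth=4. In a real pipeline this would train and
    # evaluate a model for each hyperparameter combination.
    return (lr - 0.1) ** 2 + (depth - 4) ** 2

grid = {"lr": [0.01, 0.1, 1.0], "depth": [2, 4, 8]}

best_params, best_loss = None, float("inf")
for lr, depth in product(grid["lr"], grid["depth"]):
    loss = objective(lr, depth)
    if loss < best_loss:
        best_params, best_loss = {"lr": lr, "depth": depth}, loss

print(best_params)
```

At the scale the posting describes, this inner loop would be parallelized across a cluster (e.g., with Spark or Kubernetes jobs) rather than run sequentially.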
- MTech or Ph.D. in Computer Science, Software Engineering, Mathematics or related fields
Technical must haves:
● Extensive exposure to at least one Business Intelligence platform (ideally QlikView/Qlik Sense); if not Qlik, then ETL tool knowledge, e.g. Informatica/Talend
● At least one data query language - SQL/Python
● Experience in creating breakthrough visualizations
● Understanding of RDBMS, Data Architecture/Schemas, Data Integrations, Data Models and Data Flows is a must
● A technical degree like BE/B.Tech is a must
Technical Ideal to have:
● Exposure to our tech stack – PHP
● Microsoft workflows knowledge
Behavioural Pen Portrait:
● Must Have: Enthusiastic, aggressive, vigorous, high achievement orientation, strong command over spoken and written English
● Ideal: Ability to Collaborate
The preferred location is Ahmedabad; however, for exemplary talent we are open to a remote working model (can be discussed).
● Experience with various stream processing and batch processing tools (Kafka, Spark etc.), and programming with Python.
● Experience with relational and non-relational databases.
● Fairly good understanding of AWS (or any equivalent).
● Design new systems and redesign existing systems to work at scale.
● Care about things like fault tolerance, durability, backups and recovery, performance, maintainability, code simplicity etc.
● Lead a team of software engineers and help create an environment of ownership.
● Introduce best practices of software development and ensure their adoption across the team.
● Help set and maintain coding standards for the team.
The successful candidate will turn data into information, information into insight and insight into business decisions.
Data Analyst Job Duties
Data analyst responsibilities include conducting full lifecycle analysis to include requirements, activities and design. Data analysts will develop analysis and reporting capabilities. They will also monitor performance and quality control plans to identify improvements.
● Interpret data, analyze results using statistical techniques and provide ongoing reports.
● Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality.
● Acquire data from primary or secondary data sources and maintain databases/data systems.
● Identify, analyze, and interpret trends or patterns in complex data sets.
● Filter and “clean” data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems.
● Work with management to prioritize business and information needs.
● Locate and define new process improvement opportunities.
● Proven working experience as a Data Analyst or Business Data Analyst.
● Technical expertise regarding data models, database design development, data mining and segmentation techniques.
● Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, SPSS, SAS etc.).
● Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
● Adept at queries, report writing and presenting findings.
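The "filter and clean" duty above is concrete enough to sketch. Below is a small, hedged pandas example of the pattern: deduplicating records, coercing types, and filtering out impossible values. The dataset and column names are invented for illustration.

```python
import pandas as pd

# Hypothetical raw extract: duplicated rows and amounts stored as strings,
# including one impossible negative value.
raw = pd.DataFrame({
    "order_id": [101, 101, 102, 103],
    "amount": ["250", "250", "-30", "480"],
})

clean = (
    raw.drop_duplicates(subset="order_id")          # remove repeated records
       .assign(amount=lambda d: pd.to_numeric(d["amount"]))  # fix types
)
clean = clean[clean["amount"] > 0]                  # drop out-of-range values
print(clean)
```

In a real workflow each dropped or corrected row would typically be logged so the upstream data-quality problem can be reported back to its source.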
Job Location: South Delhi, New Delhi
Experience: 2 - 10 Years
Must Have Skills:
• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good experience with SQL databases - able to write queries of fair complexity
• Excellent experience in Big Data programming for data transformations and aggregations
• Good understanding of ELT architecture: business-rules processing and data extraction from a Data Lake into data streams for business consumption
• Good customer communication skills
• Good analytical skills
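The PySpark skills listed above center on dataframe transformations and Spark SQL-style aggregation. For a dependency-free sketch, the same logic is shown here in pandas, with the PySpark/Spark SQL equivalent noted in comments; the data is invented.

```python
import pandas as pd

# Toy fact table; in a Spark job this would be a distributed DataFrame
# read from the data lake.
sales = pd.DataFrame({
    "region": ["N", "N", "S", "S", "S"],
    "revenue": [100, 150, 80, 120, 100],
})

# pandas equivalent of the Spark SQL query:
#   spark.sql("SELECT region, SUM(revenue) AS total FROM sales GROUP BY region")
# or, with the DataFrame API:
#   sales.groupBy("region").agg(F.sum("revenue").alias("total"))
totals = (
    sales.groupby("region", as_index=False)["revenue"].sum()
         .rename(columns={"revenue": "total"})
)
print(totals)
```

The transformation is identical in spirit; Spark simply executes it partition by partition across the cluster instead of in local memory.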
Degree / Diploma: Bachelor of Engineering, Bachelor of Computer Applications, Any Engineering
Specialization / Subject: Any Specialisation
Job Type: Full Time
Deep-Rooted.Co is on a mission to get fresh, clean, community (local farmer) produce from harvest to your home, with a promise of quality first! Our values are rooted in trust, convenience, and dependability, with a bunch of learning & fun thrown in.
Founded out of Bangalore by Arvind, Avinash, Guru, and Santosh, we have raised $7.5 million to date across Seed, Series A, and debt funding from investors including Accel, Omnivore, and Mayfield. Our brand Deep-Rooted.Co, launched in August 2020, was the first of its kind in India's fruits & vegetables (F&V) space. We are present in Bangalore & Hyderabad and on a journey of expansion to newer cities, managed seamlessly through a tech platform designed and built to transform the Agri-Tech sector.
Deep-Rooted.Co is committed to building a diverse and inclusive workplace and is an equal-opportunity employer.
How is this possible? It's because we work with smart people. We are looking for Engineers in Bangalore to work with the Product Leader (Founder) and CTO. This is a meaningful project for us, and we are sure you will love it, as it touches everyday life and is fun. This will be a virtual consultation.
We want to start the conversation about the project we have for you, but before that, we want to connect with you to know what’s on your mind. Do drop a note sharing your mobile number and letting us know when we can catch up.
Purpose of the role:
* As a startup, we have data distributed across various sources like Excel, Google Sheets, and databases. As we grow, we need swift decision-making based on all the data that exists. You will help us bring this data together and put it in a data model that can be used in business decision-making.
* Handle the nuances of the Excel and Google Sheets APIs.
* Pull data in and manage its growth, freshness, and correctness.
* Transform data into a format that aids easy decision-making for Product, Marketing, and Business Heads.
* Understand the business problem, solve it using technology, and take it to production - no hand-offs; the full path to production is yours.
* Good knowledge of and experience with programming languages - Java, SQL, Python.
* Good knowledge of Data Warehousing and Data Architecture.
* Experience with data transformations and ETL.
* Experience with API tools and more closed systems like Excel, Google Sheets, etc.
* Experience with the AWS Cloud Platform and Lambda.
* Experience with distributed data processing tools.
* Experience with container-based deployments on cloud.
Java, SQL, Python, Data Build Tool, Lambda, HTTP, Rest API, Extract Transform Load.
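The core loop the role describes - pulling spreadsheet data and reshaping it into a tidy, typed model - can be sketched briefly. The Sheets call below is mocked with a stub function; in production it would be a Google Sheets REST API request, and all names and values here are invented.

```python
import pandas as pd

def fetch_sheet_rows():
    # Stand-in for a Google Sheets API response: a header row followed by
    # data rows, with every cell returned as a string.
    return [
        ["date", "city", "orders"],
        ["2023-01-01", "Bangalore", "120"],
        ["2023-01-01", "Hyderabad", "80"],
    ]

rows = fetch_sheet_rows()
df = pd.DataFrame(rows[1:], columns=rows[0])

# Sheets return everything as text, so enforcing types is the first
# transformation before the data can feed dashboards or models.
df["orders"] = df["orders"].astype(int)
df["date"] = pd.to_datetime(df["date"])
print(df)
```

From here the typed frame would be loaded into the warehouse (the posting mentions AWS and Lambda as the likely hosting pieces) on a schedule, keeping the model fresh.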
We celebrate diversity, embrace a data-driven culture, and deeply encourage professional development through classes, certifications, and conferences. The reciprocity of sharing knowledge and growth with each other, our clients, and partners is a foundation we live by. Employees at Shyftlabs enjoy unlimited paid time off, 11 paid holidays, comprehensive health, vision, and dental benefits, and profit-sharing.
- Design, implement and operate stable, scalable, low-cost solutions to flow data from production systems into the data lake and into end-user-facing applications.
- Design automated processes for in-depth analysis of databases.
- Design automated data control processes.
- Collaborate with the software development team to build and test the designed solutions.
- Learn, publish, analyze and improve management information dashboards, operational business metrics decks, and key performance indicators.
- Improve tools, and processes, scale existing solutions, and create new solutions as required based on stakeholder needs.
- Provide in-depth analysis to management with the support of accounting, finance, and transportation teams.
- Perform monthly variance analysis and identify risks & opportunities.
- 3+ years of experience as a Data Engineer or in a similar role
- Experience with data modeling, data warehousing, and building ETL pipelines
- Experience in SQL
- Degree in Computer Science, Engineering, Mathematics, or a related field and 4+ years of industry experience
- Graduate degree in Computer Science, Engineering or related technical field
- Knowledge of professional software engineering practices & best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Proficiency with at least one Object Oriented language (e.g. Java, Python, Ruby)
- Strong customer focus, ownership, urgency, and drive.
- Excellent communication skills and the ability to work well in a team.
- Effective analytical, troubleshooting, and problem-solving skills.
- Experience building data products incrementally and integrating and managing datasets from multiple sources
- Experience with AWS Tools and Technologies (Redshift, S3, EC2, Glue)
- Expertise in data modeling and advanced SQL with Oracle, MySQL, and columnar databases
- Experience with Snowflake
DATA SCIENTIST-MACHINE LEARNING
GormalOne LLP. Mumbai IN
GormalOne is a social impact Agri tech enterprise focused on farmer-centric projects. Our vision is to make farming highly profitable for the smallest farmer, thereby ensuring India's “Nutrition security”. Our mission is driven by the use of advanced technology. Our technology will be highly user-friendly, for the majority of farmers, who are digitally naive. We are looking for people, who are keen to use their skills to transform farmers' lives. You will join a highly energized and competent team that is working on advanced global technologies such as OCR, facial recognition, and AI-led disease prediction amongst others.
GormalOne is looking for a machine learning engineer to join the team. This collaborative yet dynamic role is suited for candidates who enjoy the challenge of building, testing, and deploying end-to-end ML pipelines and incorporating MLOps best practices across different technology stacks supporting a variety of use cases. We seek candidates who are curious not only about furthering their own knowledge of MLOps best practices through hands-on experience but who can simultaneously help uplift the knowledge of their colleagues.
Roles & Responsibilities
- Individual contributor
- Developing and maintaining an end-to-end data science project
- Deploying scalable applications on different platforms
- Ability to analyze and enhance the efficiency of existing products
What are we looking for?
- 3 to 5 Years of experience as a Data Scientist
- Skilled in Data Analysis, EDA, Model Building, and Analysis.
- Basic coding skills in Python
- Decent knowledge of Statistics
- Creating pipelines for ETL and ML models.
- Experience in the operationalization of ML models
- Good exposure to Deep Learning, ANN, DNN, CNN, RNN, and LSTM.
- Hands-on experience in Keras, PyTorch or Tensorflow
- Tech/BE in Computer Science or Information Technology
- Certification in AI, ML, or Data Science is preferred.
- Master/Ph.D. in a relevant field is preferred.
- Experience with tools and packages like Tensorflow, MLFlow, Airflow
- Experience with object detection techniques like YOLO
- Exposure to cloud technologies
- Operationalization of ML models
- Good understanding and exposure to MLOps
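As a tiny, self-contained illustration of the "model building" skill listed above, here is logistic regression trained by gradient descent in plain numpy. The data is synthetic and the hyperparameters are arbitrary; a production version would of course use a framework like PyTorch or Tensorflow, as the posting requires.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # toy linearly separable labels

w = np.zeros(2)
lr = 0.5
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-X @ w))        # sigmoid predictions
    w -= lr * X.T @ (p - y) / len(y)        # averaged log-loss gradient step

preds = 1.0 / (1.0 + np.exp(-X @ w)) > 0.5
acc = (preds == y).mean()
print(f"train accuracy: {acc:.2f}")
```

The same fit-predict-evaluate loop, wrapped in experiment tracking (e.g., MLFlow) and scheduled retraining (e.g., Airflow), is essentially what the posting's "operationalization of ML models" bullet refers to.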
Kindly note: Salary shall be commensurate with qualifications and experience
YOU'LL BE OUR: Data Scientist
YOU'LL BE BASED AT: IBC Knowledge Park, Bangalore
YOU'LL BE ALIGNED WITH: Engineering Manager
YOU'LL BE A MEMBER OF: Data Intelligence
WHAT YOU'LL DO AT ATHER:
Work with the vehicle intelligence platform to evolve the algorithms and the platform, enhancing the ride experience.
Provide data-driven solutions, from simple to fairly complex insights, on the data collected from the vehicle.
Identify measures and metrics that can be used insightfully to make decisions across firmware components, and productionize them.
Support the data science lead and manager, and partner in fairly intensive projects around diagnostics, predictive modeling, BI, and engineering data sciences.
Build and automate scripts that can be re-used efficiently.
Build interactive reports/dashboards that can be re-used across engineering teams for their discussions and explorations iteratively.
Support monitoring and measuring the success of algorithms and features built, and lead innovation through objective reasoning and thinking. Engage with the data science lead and the engineering team stakeholders on the solution approach and draft a plan of action.
Contribute to the product/team roadmap by generating and implementing innovative data- and analysis-based ideas as product features.
Handhold/guide the team in successful conceptualization and implementation of key product differentiators through effective benchmarking.
HERE'S WHAT WE ARE LOOKING FOR :
• Good understanding of C++ and Golang programming, and of system architecture
• Experience with IOT, telemetry will be a plus
• Proficient in R markdown/ Python/ Grafana
• Proficient in SQL and No-SQL
• Proficient in R / Python programming
• Good understanding of ML techniques / Spark ML
YOU BRING TO ATHER:
• B.E/B.Tech preferably in Computer Science
• 3 to 5 yrs of work experience as Data Scientist
Required Experience: 5 - 7 Years
Skills : ADF, Azure, SSIS, python
Azure Data Engineer with hands on SSIS migrations and ADF expertise.
Roles & Responsibilities
• Overall, 6+ years’ experience in Cloud Data Engineering, with hands-on experience in ADF (Azure Data Factory), is required.
• Hands-on experience with SSIS-to-ADF migration is preferred.
• Experience moving SQL Server Integration Services (SSIS) workloads to SSIS in ADF (must have done at least one migration).
• Hands-on experience implementing Azure Data Factory frameworks, scheduling, and performance tuning.
• Hands-on experience migrating SSIS solutions to ADF.
• Hands-on experience with ADF development.
• Hands-on experience with MPP database architecture.
• Hands-on experience in Python.