27+ Big Data Jobs in Mumbai | Big Data Job Openings in Mumbai
Apply to 27+ Big Data jobs in Mumbai on CutShort.io. Explore the latest Big Data job opportunities across top companies like Google, Amazon & Adobe.
Experience: 12-15 Years
Key Responsibilities:
- Client Engagement & Requirements Gathering: Independently engage with client stakeholders to understand data landscapes and requirements, translating them into functional and technical specifications.
- Data Architecture & Solution Design: Architect and implement Hadoop-based Cloudera CDP solutions, including data integration, data warehousing, and data lakes.
- Data Processes & Governance: Develop data ingestion and ETL/ELT frameworks, ensuring robust data governance and quality practices (a minimal pipeline sketch follows this list).
- Performance Optimization: Provide SQL expertise and optimize Hadoop ecosystems (HDFS, Ozone, Kudu, Spark Streaming, etc.) for maximum performance.
- Coding & Development: Hands-on coding in relevant technologies and frameworks, ensuring project deliverables meet stringent quality and performance standards.
- API & Database Management: Integrate APIs and manage databases (e.g., PostgreSQL, Oracle) to support seamless data flows.
- Leadership & Mentoring: Guide and mentor a team of data engineers and analysts, fostering collaboration and technical excellence.
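By way of illustration, here is a minimal sketch of the kind of ingestion/ETL pipeline the responsibilities above describe, written in PySpark. The paths, column names, and the quality rule are illustrative assumptions, not details from the listing.

```python
# A minimal ETL sketch: ingest raw CSV, apply a basic quality gate,
# and write partitioned Parquet. Paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_orders").getOrCreate()

raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("hdfs:///landing/orders/"))          # hypothetical landing zone

# Basic data-quality gate: drop rows missing the primary key, flag negatives.
clean = (raw
         .dropna(subset=["order_id"])
         .withColumn("is_valid", F.col("amount") >= 0))

(clean.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("hdfs:///warehouse/orders/"))          # hypothetical warehouse path

spark.stop()
```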
Skills Required:
a. Technical Proficiency:
• Extensive experience with Hadoop ecosystem tools and services (HDFS, YARN, Cloudera Manager, Impala, Kudu, Hive, Spark Streaming, etc.).
• Proficiency in Spark and in programming languages like Python and Scala, with a strong grasp of SQL performance tuning.
• ETL tool expertise (e.g., Informatica, Talend, Apache NiFi) and data modelling knowledge.
• API integration skills for effective data flow management.
b. Project Management & Communication:
• Proven ability to lead large-scale data projects and manage project timelines.
• Excellent communication, presentation, and critical thinking skills.
c. Client & Team Leadership:
• Engage effectively with clients and partners, leading onsite and offshore teams.
- Architectural Leadership:
- Design and architect robust, scalable, and high-performance Hadoop solutions.
- Define and implement data architecture strategies, standards, and processes.
- Collaborate with senior leadership to align data strategies with business goals.
- Technical Expertise:
- Develop and maintain complex data processing systems using Hadoop and its ecosystem (HDFS, YARN, MapReduce, Hive, HBase, Pig, etc.).
- Ensure optimal performance and scalability of Hadoop clusters.
- Oversee the integration of Hadoop solutions with existing data systems and third-party applications.
- Strategic Planning:
- Develop long-term plans for data architecture, considering emerging technologies and future trends.
- Evaluate and recommend new technologies and tools to enhance the Hadoop ecosystem.
- Lead the adoption of big data best practices and methodologies.
- Team Leadership and Collaboration:
- Mentor and guide data engineers and developers, fostering a culture of continuous improvement.
- Work closely with data scientists, analysts, and other stakeholders to understand requirements and deliver high-quality solutions.
- Ensure effective communication and collaboration across all teams involved in data projects.
- Project Management:
- Lead large-scale data projects from inception to completion, ensuring timely delivery and high quality.
- Manage project resources, budgets, and timelines effectively.
- Monitor project progress and address any issues or risks promptly.
- Data Governance and Security:
- Implement robust data governance policies and procedures to ensure data quality and compliance.
- Ensure data security and privacy by implementing appropriate measures and controls.
- Conduct regular audits and reviews of data systems to ensure compliance with industry standards and regulations.
Minimum of 8 years of experience, of which at least 4 years should be applied data-mining experience in disciplines such as call-centre metrics.
Strong experience in advanced statistics and analytics including segmentation, modelling, regression, forecasting etc.
Experience with leading and managing large teams.
Demonstrated pattern of success in using advanced quantitative analytic methods to solve business problems.
Demonstrated experience with Business Intelligence/Data Mining tools to work with data, investigate anomalies, construct data sets, and build models.
It is critical to share details of projects undertaken (preferably in the telecom industry), specifically analyses drawn from CRM data.
Role: Principal Software Engineer
We are looking for a passionate Principal Engineer - Analytics to build data products that extract valuable business insights for efficiency and customer experience. This role will require managing, processing and analyzing large amounts of raw information in scalable databases. It will also involve developing unique data structures and writing algorithms for an entirely new set of products. The candidate must have critical thinking and problem-solving skills, must be experienced in software development with advanced algorithms, and must be able to handle large volumes of data. Exposure to statistics and machine learning algorithms is a big plus. The candidate should have some exposure to cloud environments, continuous integration and agile Scrum processes.
Responsibilities:
• Lead projects both as a principal investigator and project manager, responsible for meeting project requirements on schedule
• Software development that creates data-driven intelligence in products with Big Data backends
• Exploratory analysis of the data to be able to come up with efficient data structures and algorithms for given requirements
• The system may or may not involve machine learning models and pipelines but will require advanced algorithm development
• Managing data in large-scale data stores (such as NoSQL DBs, time series DBs, geospatial DBs, etc.)
• Creating metrics and evaluating algorithms for better accuracy and recall (a small example follows this list)
• Ensuring efficient access and usage of data by means of indexing, clustering, etc.
• Collaborate with engineering and product development teams.
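As a small illustration of the metric-evaluation bullet above, here is a sketch that computes precision and recall for a binary classifier. The labels and predictions are dummy data invented for the example.

```python
# Compute precision and recall from raw binary predictions.
def precision_recall(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Illustrative dummy data.
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 1, 1]
p, r = precision_recall(y_true, y_pred)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.75 recall=0.75
```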
Requirements:
• Master’s or Bachelor’s degree in Engineering in one of these domains - Computer Science, Information Technology, Information Systems, or related field from top-tier school
• OR Master’s degree or higher in Statistics, Mathematics, with hands on background in software development.
• 8 to 10 years of experience in product development, including algorithmic work
• 5+ years of experience working with large data sets or doing large-scale quantitative analysis
• Understanding of SaaS based products and services.
• Strong algorithmic problem-solving skills
• Able to mentor and manage a team and take responsibility for team deadlines.
Skill set required:
• In-depth knowledge of the Python programming language
• Understanding of software architecture and software design
• Must have fully managed a project with a team
• Having worked with Agile project management practices
• Experience with data processing, analytics and visualization tools in Python (such as pandas, Matplotlib, SciPy, etc.); a brief sketch follows this list
• Strong understanding of SQL and of querying NoSQL databases (e.g. MongoDB, Cassandra, Redis)
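Here is a brief sketch of the pandas-style exploratory work the skill list implies: load a dataset, investigate anomalies, and plot a distribution. The file name and columns are illustrative assumptions.

```python
# Exploratory analysis sketch with pandas and matplotlib.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("events.csv", parse_dates=["timestamp"])  # hypothetical file

# Investigate anomalies: rows with missing or out-of-range values.
anomalies = df[df["latency_ms"].isna() | (df["latency_ms"] < 0)]
print(f"{len(anomalies)} anomalous rows out of {len(df)}")

# Summarise and visualise the healthy data.
healthy = df.dropna(subset=["latency_ms"])
print(healthy["latency_ms"].describe())

healthy["latency_ms"].plot(kind="hist", bins=50, title="Latency distribution")
plt.xlabel("latency (ms)")
plt.savefig("latency_hist.png")
```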
Roles and Responsibilities
Seeking an AWS Cloud Engineer / Data Warehouse Developer for our Data CoE team to help us configure and develop new AWS environments for our Enterprise Data Lake and migrate traditional on-premise workloads to the cloud. Must have a sound understanding of BI best practices, relational structures, dimensional data modelling, structured query language (SQL) skills, and data warehouse and reporting techniques.
Extensive experience in providing AWS Cloud solutions to various business use cases.
Creating star schema data models, performing ETLs and validating results with business representatives (a schema sketch follows below).
Supporting implemented BI solutions by monitoring and tuning queries and data loads, addressing user questions concerning data integrity, monitoring performance, and communicating functional and technical issues.
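As a hedged sketch of the star-schema modelling mentioned above: one fact table with foreign keys into two dimensions. Table and column names are invented for illustration, and psycopg2 plus the connection string are assumptions about the stack rather than details from the listing.

```python
# Create a simple star schema: two dimensions plus a fact table.
import psycopg2

DDL = """
CREATE TABLE dim_customer (
    customer_key INT PRIMARY KEY,
    customer_name VARCHAR(100),
    region VARCHAR(50)
);
CREATE TABLE dim_date (
    date_key INT PRIMARY KEY,
    full_date DATE,
    fiscal_quarter VARCHAR(6)
);
CREATE TABLE fact_sales (
    sale_id BIGINT,
    customer_key INT REFERENCES dim_customer (customer_key),
    date_key INT REFERENCES dim_date (date_key),
    amount DECIMAL(12, 2)
);
"""

with psycopg2.connect("dbname=warehouse user=etl") as conn:  # hypothetical DSN
    with conn.cursor() as cur:
        cur.execute(DDL)
```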
Job Description:
This position is responsible for the successful delivery of business intelligence information to the entire organization and is experienced in BI development and implementations, data architecture and data warehousing.
Requisite Qualifications:
- Essential: AWS Certified Database Specialty or AWS Certified Data Analytics
- Preferred: any other Data Engineer certification
Requisite Experience:
- Essential: 4-7 years of experience
- Preferred: 2+ years of experience in ETL & data pipelines
Skills Required:
- AWS: S3, DMS, Redshift, EC2, VPC, Lambda, Delta Lake, CloudWatch, etc. (an illustrative boto3 sketch follows this list)
- Big Data: Databricks, Spark, Glue and Athena
- Expertise in Lake Formation, Python programming, Spark, shell scripting
- Minimum Bachelor's degree with 5+ years of experience in designing, building, and maintaining AWS data components
- 3+ years of experience in data component configuration, related roles and access setup
- Expertise in Python programming
- Knowledge of all aspects of DevOps (source control, continuous integration, deployments, etc.)
- Comfortable working with DevOps tooling: Jenkins, Bitbucket, CI/CD
- Hands-on ETL development experience, preferably using SSIS
- SQL Server experience required
- Strong analytical skills to solve and model complex business requirements
- Sound understanding of BI best practices/methodologies, relational structures, dimensional data modelling, structured query language (SQL) skills, data warehouse and reporting techniques
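To illustrate the AWS building blocks listed above, here is a small boto3 sketch that uploads a file to S3 and runs an Athena query over it. The bucket, database, and table names are hypothetical; credentials are assumed to come from the standard AWS config/environment.

```python
# Upload an extract to S3, then query it with Athena via boto3.
import boto3

s3 = boto3.client("s3")
s3.upload_file("daily_extract.csv", "my-data-lake", "raw/daily_extract.csv")

athena = boto3.client("athena")
resp = athena.start_query_execution(
    QueryString="SELECT region, COUNT(*) FROM raw_events GROUP BY region",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://my-data-lake/athena-results/"},
)
print("query id:", resp["QueryExecutionId"])
```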
Preferred Skills:
Experience working in a Scrum environment.
Experience in administration (Windows/Unix/Network/…) is a plus.
Experience in SQL Server, SSIS, SSAS, SSRS
Comfortable with creating data models and visualization using Power BI
Hands-on experience in relational and multi-dimensional data modelling, including multiple source systems from databases and flat files, and the use of standard data modelling tools
Ability to collaborate on a team with infrastructure, BI report development and business analyst resources, and clearly communicate solutions to both technical and non-technical team members
Job Title - Senior Java Developers
Job Description - Backend Engineer - Lead (Java)
Mumbai, India | Engineering Team | Full-time
Are you passionate enough to be a crucial part of a highly analytical and scalable user engagement platform?
Are you ready to learn new technologies and willing to step out of your comfort zone to explore and learn new skills?
If so, this is an opportunity for you to join a high-functioning team and make your mark on our organisation!
The Impact you will create:
- Build campaign generation services which can send app notifications at a speed of 10 million a minute
- Dashboards to show real-time key performance indicators to clients
- Develop complex user segmentation engines which create segments over terabytes of data within a few seconds
- Building highly available & horizontally scalable platform services for ever-growing data
- Use cloud-based services like AWS Lambda for blazing-fast throughput & auto-scalability
- Work on complex analytics on terabytes of data, like building cohorts, funnels, user path analysis, and Recency-Frequency-Monetary analysis at blazing speed
- You will build backend services and APIs to create scalable engineering systems.
- As an individual contributor, you will tackle some of our broadest technical challenges that require deep technical knowledge, hands-on software development and seamless collaboration with all functions.
- You will envision and develop features that are highly reliable and fault-tolerant to deliver a superior customer experience.
- Collaborate with various cross-functional teams in the company to meet deliverables throughout the software development lifecycle.
- Identify and act on areas of improvement through data insights and research.
What we look for?
- 5-9 years of experience in backend development; must have worked with Java/Shell/Perl/Python scripting.
- Solid understanding of engineering best practices, continuous integration, and incremental delivery.
- Strong analytical skills, debugging and troubleshooting skills, product line analysis.
- Follower of agile methodology (sprint planning, working in JIRA, retrospectives, etc.).
- Proficiency with tools like Docker, Maven, Jenkins and knowledge of Java frameworks like Spring, Spring Boot, Hibernate, JPA.
- Ability to design application modules using concepts like object orientation, multi-threading, synchronization, caching, fault tolerance, sockets, various IPCs, database interfaces, etc.
- Hands-on experience with Redis, MySQL, streaming technologies like Kafka producers/consumers (a brief sketch follows this list), and NoSQL databases like MongoDB/Cassandra.
- Knowledge of version control (Git) and deployment processes (CI/CD).
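As a hedged sketch of the Kafka producer/consumer pattern named above, here is a minimal example using the kafka-python package. The broker address, topic name, and payload are assumptions made for illustration.

```python
# Minimal Kafka producer/consumer round trip with kafka-python.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("notifications", {"user_id": 42, "campaign": "diwali_sale"})
producer.flush()

consumer = KafkaConsumer(
    "notifications",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)   # e.g. {'user_id': 42, 'campaign': 'diwali_sale'}
    break
```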
What’s in it for you?
- Immense growth, continuous learning, and the chance to deliver your best to top-notch brands
- Work with some of the most innovative brains
- Opportunity to explore your entrepreneurial mindset
- Open culture where your creative bug gets activated.
If this sounds like a company you would like to be a part of, and a role you would thrive in, please don’t hold back from applying! We need your unique perspective for our continued innovation and success!
So let's converse! We are keen to know more about you.
Skills
Java, MongoDB, Redis, Cassandra, Kafka, RabbitMQ
Role / Purpose - Lead Developer - API and Microservices
Must have a strong hands-on development track record building integration utilizing a variety of integration products, tools, protocols, technologies, and patterns.
- Must have an in-depth understanding of SOA/EAI/ESB concepts, SOA governance, event-driven architecture, message-based architectures, file sharing and exchange platforms, data virtualization and caching strategies, J2EE design patterns, and frameworks
- Should possess experience with at least one middleware technology (application servers, BPMS, BRMS, ESB & message brokers), programming languages (e.g. Java/J2EE, JavaScript, COBOL, C), operating systems (e.g. Windows, Linux, MVS), and databases (DB2, MySQL, NoSQL databases like MongoDB, Cassandra, Hadoop, etc.)
- Must have advanced skills in implementing API service architectures (SOAP, REST) using any of the market-leading API management tools such as Apigee, and frameworks such as Spring Boot for microservices
- Appetite to manage large-scale projects and multiple tracks
- Experience and know-how of the e-commerce domain and retail experience are preferred
- Good communication & people-management skills
• Project Planning and Management
o Take end-to-end ownership of multiple projects / project tracks
o Create and maintain project plans and other related documentation for project objectives, scope, schedule and delivery milestones
o Lead and participate across all the phases of software engineering, right from requirements gathering to go-live
o Lead internal team meetings on solution architecture, effort estimation, manpower planning and resource (software/hardware/licensing) planning
o Manage RIDA (Risks, Impediments, Dependencies, Assumptions) for projects by developing effective mitigation plans
• Team Management
o Act as the Scrum Master
o Conduct Scrum ceremonies like sprint planning, daily standup, sprint retrospective
o Set clear objectives for the project and roles/responsibilities for each team member
o Train and mentor the team on their job responsibilities and Scrum principles
o Make the team accountable for their tasks and help the team in achieving them
o Identify the requirements and come up with a plan for skill development for all team members
• Communication
o Be the single point of contact for the client in terms of day-to-day communication
o Periodically communicate project status to all the stakeholders (internal/external)
• Process Management and Improvement
o Create and document processes across all disciplines of software engineering
o Identify gaps and continuously improve processes within the team
o Encourage team members to contribute towards process improvement
o Develop a culture of quality and efficiency within the team
Must have:
• Minimum 8 years of experience (hands-on as well as leadership) in software / data engineering across multiple job functions like business analysis, development, solutioning, QA, DevOps and project management
• Hands-on as well as leadership experience in Big Data engineering projects
• Experience developing or managing cloud solutions using Azure or other cloud providers
• Demonstrable knowledge of Hadoop, Hive, Spark, NoSQL DBs, SQL, data warehousing, ETL/ELT, and DevOps tools
• Strong project management and communication skills
• Strong analytical and problem-solving skills
• Strong systems level critical thinking skills
• Strong collaboration and influencing skills
Good to have:
• Knowledge of PySpark, Azure Data Factory, Azure Data Lake Storage, Synapse Dedicated SQL Pool, Databricks, Power BI, Machine Learning, Cloud Infrastructure
• Background in BFSI with focus on core banking
• Willingness to travel
Work Environment
• Customer Office (Mumbai) / Remote Work
Education
• UG: B. Tech - Computers / B. E. – Computers / BCA / B.Sc. Computer Science
Responsibilities
- Manage and drive a team of Data Analysts and Sr. Data Analysts to provide logistics and supply chain solutions.
- Conduct meetings with Clients to gather the requirements and understand the scope.
- Conduct meetings with internal stakeholders to walk them through the solution and hand over the analysis.
- Define business problems, identify solutions, provide analysis and insights from the client's data.
- Conduct scheduled progress reviews on all projects and interact with the onsite team daily.
- Ensure solutions are delivered error free and submitted on time.
- Implement ETL processes using Pentaho Data Integration (Pentaho ETL).
- Design and implement data models in Hadoop.
- Provide end-user training and technical assistance to maximize utilization of tools.
- Deliver technical guidance to team, including hands-on development as necessary; oversee standards, change controls and documentation library for training and reuse.
Requirements
- Bachelor's degree in Engineering.
- 16+ years of experience in supply chain and logistics or a related industry, including analytics experience.
- 3 years of experience in team handling (8+ people) and interacting with executive leadership teams.
- Strong project and time management skills with ability to multitask and prioritize workload.
- Solid expertise with MS Excel, SQL, any visualization tools like Tableau/Power BI, any ETL tools.
- Proficiency in Hadoop / Hive.
- Experience with Pentaho ETL, Pentaho Visualization API, Tableau.
- Hands-on experience working with big data sets (data sets with millions of records).
- Strong technical and Management experience.
Desired Skills and Experience
- .NET, ASP.NET
Synergetic IT Services India Pvt Ltd
2. Responsible for gathering system requirements, working together with application architects and owners
3. Responsible for generating scripts and templates required for the automatic provisioning of resources
4. Discover standard cloud services offerings; install and execute processes and standards for optimal use of cloud service provider offerings
5. Incident management on IaaS, PaaS, SaaS
6. Responsible for debugging technical issues inside a complex stack involving virtualization, containers, microservices, etc.
7. Collaborate with the engineering teams to enable their applications to run on Cloud infrastructure
8. Experience with OpenStack, Linux, Amazon Web Services, Microsoft Azure, DevOps, NoSQL, etc. will be a plus
9. Design, implement, configure, and maintain various Azure IaaS, PaaS, SaaS services
10. Deploy and maintain Azure IaaS Virtual Machines and Azure Application and Networking Services
11. Optimize Azure billing for cost/performance (VM optimization, reserved instances, etc.)
12. Implement, and fully document, IT projects
13. Identify improvements to IT documentation, network architecture, processes/procedures, and tickets
14. Research products and new technologies to increase efficiency of business and operations
15. Keep all tickets and projects updated and track time in a detailed format
16. Should be able to multi-task and work across a range of projects and issues with various timelines and priorities
Technical:
• Minimum 1 year of experience with Azure; knowledge of Office 365 services preferred.
• Formal education in IT preferred
• Experience with the Managed Service business model is a major plus
• Bachelor’s degree preferred
We are looking for a Director of Engineering to lead one of our key product engineering teams. This role will report directly to the VP of Engineering and will be responsible for successful execution of the company's business mission through development of cutting-edge software products and solutions.
- As an owner of the product, you will be required to plan and execute the product roadmap and provide technical leadership to the engineering team.
- You will have to collaborate with Product Management and Implementation teams and build a commercially successful product.
- You will be responsible for recruiting and leading a team of highly skilled software engineers and providing strong, hands-on engineering leadership.
- Deep technical knowledge in software product engineering using Java/J2EE, Node.js, React.js, full-stack development, NoSQL DBs (MongoDB, Cassandra, Neo4j), Elasticsearch, Kibana, ELK, Kafka, Redis, Docker, Kubernetes, Apache, Solr, ActiveMQ, RabbitMQ, Spark, Scala, Sqoop, HBase, Hive, WebSockets, web crawlers, Spring Boot, etc. is a must.
Requirements
16+ years of experience in Software Engineering with at least 5+ years as an engineering leader in a software product company.
- Hands-on technical leadership with proven ability to recruit high performance talent
- High technical credibility - ability to audit technical decisions and push for the best solution to a problem.
- Experience building E2E applications, right from the backend database to the persistence layer.
- Experience with UI technologies (Angular, React.js, Node.js) or a full-stack environment is preferred.
- Experience with NoSQL technologies (MongoDB, Cassandra, Neo4j, Dynamodb, etc.)
- Elastic Search, Kibana, ELK, Logstash.
- Experience in developing Enterprise Software using Agile Methodology.
- Good understanding of Kafka, Redis, ActiveMQ, RabbitMQ, Solr etc.
- SaaS cloud-based platform exposure.
- Experience on Docker, Kubernetes etc.
- Ownership of E2E design and development, and exposure to delivering quality enterprise products/applications
- A track record of setting and achieving high standards
- Strong understanding of modern technology architecture
- Key Programming Skills: Java, J2EE with cutting edge technologies
- Excellent team building, mentoring and coaching skills are a must-have
Benefits
Five Reasons Why You Should Join Zycus
- Cloud Product Company: We are a Cloud SaaS company and our products are created using the latest technologies like ML and AI. Our UI is in AngularJS and we are developing our mobile apps using React.
- A Market Leader: Zycus is recognized by Gartner (world’s leading market research analyst) as a Leader in Procurement Software Suites.
- Move between Roles: We believe that change leads to growth and therefore we allow our employees to shift careers and move to different roles and functions within the organization
- Get a Global Exposure: You get to work and deal with our global customers.
- Create an Impact: Zycus gives you the environment to create an impact on the product and transform your ideas into reality. Even our junior engineers get the opportunity to work on different product features.
About Us
Zycus is a pioneer in Cognitive Procurement software and has been a trusted partner of choice for large global enterprises for two decades. Zycus has been consistently recognized by Gartner, Forrester, and other analysts for its Source to Pay integrated suite. Zycus powers its S2P software with the revolutionary Merlin AI Suite. Merlin AI takes over the tactical tasks and empowers procurement and AP officers to focus on strategic projects; offers data-driven actionable insights for quicker and smarter decisions, and its conversational AI offers a B2C type user-experience to the end-users.
Zycus helps enterprises drive real savings, reduce risks, and boost compliance, and its seamless, intuitive, and easy-to-use user interface ensures high adoption and value across the organization.
Start your #CognitiveProcurement journey with us, as you are #MeantforMore.
Click here to Apply :
Director of Engineering - Zycus (Mumbai): https://apply.workable.com/zycus-1/j/D926111745/
Director of Engineering - Zycus (Bengaluru): https://apply.workable.com/zycus-1/j/90665BFD4C/
Director of Engineering - Zycus (Pune): https://apply.workable.com/zycus-1/j/3A5FBA2C7C/
Responsibilities:
- Must be able to write quality code and build secure, highly available systems.
- Assemble large, complex datasets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc., with guidance.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Monitoring performance and advising on any necessary infrastructure changes.
- Defining data retention policies.
- Implementing the ETL process and optimal data pipeline architecture.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Create design documents that describe the functionality, capacity, architecture, and process.
- Develop, test, and implement data solutions based on finalized design documents.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Proactively identify potential production issues and recommend and implement solutions.
Skillsets:
- Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
- Proficient understanding of distributed computing principles.
- Experience working with batch processing / real-time systems using various open-source technologies like NoSQL, Spark, Pig, Hive, Apache Airflow.
- Implemented complex projects dealing with considerable data sizes (PB).
- Optimization techniques (performance, scalability, monitoring, etc.).
- Experience with integration of data from multiple data sources.
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, etc.
- Knowledge of various ETL techniques and frameworks, such as Flume.
- Experience with various messaging systems, such as Kafka or RabbitMQ.
- Good understanding of Lambda Architecture, along with its advantages and drawbacks.
- Creation of DAGs for data engineering (a minimal Airflow sketch follows this list).
- Expert at Python/Scala programming, especially for data engineering / ETL purposes.
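Here is a minimal sketch of the kind of Airflow DAG the bullet above refers to: a daily extract-then-transform dependency. It assumes Airflow 2.x, and the task bodies are placeholders invented for illustration.

```python
# A minimal daily ETL DAG for Apache Airflow 2.x.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling raw data from source")   # placeholder for real extract

def transform():
    print("transforming and loading")       # placeholder for real transform

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task   # transform runs only after extract succeeds
```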
- Build campaign generation services which can send app notifications at a speed of 10 million a minute
- Dashboards to show real-time key performance indicators to clients
- Develop complex user segmentation engines which create segments over terabytes of data within a few seconds
- Building highly available & horizontally scalable platform services for ever-growing data
- Use cloud-based services like AWS Lambda for blazing-fast throughput & auto-scalability
- Work on complex analytics on terabytes of data, like building cohorts, funnels, user path analysis, and Recency-Frequency-Monetary analysis at blazing speed
- You will build backend services and APIs to create scalable engineering systems.
- As an individual contributor, you will tackle some of our broadest technical challenges that require deep technical knowledge, hands-on software development and seamless collaboration with all functions.
- You will envision and develop features that are highly reliable and fault-tolerant to deliver a superior customer experience.
- Collaborate with various cross-functional teams in the company to meet deliverables throughout the software development lifecycle.
- Identify and act on areas of improvement through data insights and research.
- 2-5 years of experience in backend development; must have worked with Java/Shell/Perl/Python scripting.
- Solid understanding of engineering best practices, continuous integration, and incremental delivery.
- Strong analytical skills, debugging and troubleshooting skills, product line analysis.
- Follower of agile methodology (sprint planning, working in JIRA, retrospectives, etc.).
- Proficiency with tools like Docker, Maven, Jenkins and knowledge of Java frameworks like Spring, Spring Boot, Hibernate, JPA.
- Ability to design application modules using concepts like object orientation, multi-threading, synchronization, caching, fault tolerance, sockets, various IPCs, database interfaces, etc.
- Hands-on experience with Redis, MySQL, streaming technologies like Kafka producers/consumers, and NoSQL databases like MongoDB/Cassandra.
- Knowledge of version control (Git) and deployment processes (CI/CD).
upGrad is an online education platform building the careers of tomorrow by offering the most industry-relevant programs in an immersive learning experience. Our mission is to create a new digital-first learning experience to deliver tangible career impact to individuals at scale. upGrad currently offers programs in Data Science, Machine Learning, Product Management, Digital Marketing, and Entrepreneurship, etc. upGrad is looking for people passionate about management and education to help design learning programs for working professionals to stay sharp and stay relevant and help build the careers of tomorrow.
- upGrad was awarded the Best Tech for Education by IAMAI for 2018-19,
- upGrad was also ranked as one of the LinkedIn Top Startups 2018: The 25 most sought-after startups in India.
- upGrad was earlier selected as one of the top ten most innovative companies in India by FastCompany.
- We were also covered by the Financial Times along with other disruptors in Ed-Tech.
- upGrad is the official education partner for Government of India - Startup India program.
- Our program with IIIT B has been ranked #1 program in the country in the domain of Artificial Intelligence and Machine Learning.
About the Role
A highly motivated individual who has experience in architecting end-to-end web-based ecommerce/online/SaaS products and systems, bringing them to production quickly and with high quality; able to understand expected business results and map architecture to drive the business forward; passionate about building world-class solutions.
Role and Responsibilities
- Work with Product Managers and Business to understand business/product requirements and vision.
- Provide a clear architectural vision in line with business and product vision.
- Lead a team of architects, developers, and data engineers to provide platform services to other engineering teams.
- Provide architectural oversight to engineering teams across the organization.
- Hands on design and development of platform services and features owned by self - this is a hands-on coding role.
- Define guidelines for best practices covering design, unit testing, secure coding etc.
- Ensure quality by reviewing design, code, test plans, load test plans etc. as appropriate.
- Work closely with the QA and Support teams to track quality and proactively identify improvement opportunities.
- Work closely with DevOps and IT to ensure highly secure and cost optimized operations in the cloud.
- Grow technical skills in the team - identify skill gaps with plans to address them, participate in hiring, mentor other architects and engineers.
- Support other engineers in resolving complex technical issues as a go-to person.
Skills/Experience
- 12+ years of experience in design and development of ecommerce scale systems and highly scalable SaaS or enterprise products.
- Extensive experience in developing extensible and scalable web applications with
- Java, Spring Boot, Go
- Web Services - REST, OAuth, OData
- Database/Caching - MySQL, Cassandra, MongoDB, Memcached/Redis
- Queue/Broker services - RabbitMQ/Kafka
- Microservices architecture via Docker on AWS or Azure.
- Experience with web front end technologies - HTML5, CSS3, JavaScript libraries and frameworks such as jQuery, AngularJS, React, Vue.js, Bootstrap etc.
- Extensive experience with cloud based architectures and how to optimize design for cost.
- Expert level understanding of secure application design practices and a working understanding of cloud infrastructure security.
- Experience with CI/CD processes and design for testability.
- Experience working with big data technologies such as Spark/Storm/Hadoop/Data Lake Architectures is a big plus.
- Action and result-oriented problem-solver who works well both independently and as part of a team; able to foster and develop others' ideas as well as his/her own.
- Ability to organize, prioritize and schedule a high workload and multiple parallel projects efficiently.
- Excellent verbal and written communication with stakeholders in a matrixed environment.
- Long term experience with at least one product from inception to completion and evolution of the product over multiple years.
B.Tech/MCA (IT/Computer Science) from a premier institution (IIT/NIT/BITS) and/or a US Master's degree in Computer Science.
at Sportz Interactive
Job Role : Associate Manager (Database Development)
Key Responsibilities:
- Optimizing the performance of many stored procedures and SQL queries to deliver large amounts of data in under a few seconds.
- Designing and developing numerous complex queries, views, functions, and stored procedures to work seamlessly with the Application/Development teams' data needs.
- Responsible for providing solutions to all data-related needs to support existing and new applications.
- Creating scalable structures to cater to large user bases and manage high workloads
- Involved in every step of projects, from the beginning stages of requirements gathering through to implementation and maintenance.
- Developing custom stored procedures and packages to support new enhancement needs.
- Working with multiple teams to design, develop and deliver early warning systems.
- Reviewing query performance and optimizing code (a short EXPLAIN ANALYZE sketch follows this list)
- Writing queries used for front-end applications
- Designing and coding database tables to store the application data
- Data modelling to visualize database structure
- Working with application developers to create optimized queries
- Maintaining database performance by troubleshooting problems.
- Accomplishing platform upgrades and improvements by supervising system programming.
- Securing database by developing policies, procedures, and controls.
- Designing and managing deep statistical systems.
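As a small sketch of the query-review workflow mentioned above: run EXPLAIN ANALYZE on a slow PostgreSQL query via psycopg2 and inspect the plan. The DSN, table, and index are illustrative assumptions.

```python
# Inspect a PostgreSQL query plan to find optimization candidates.
import psycopg2

with psycopg2.connect("dbname=sports user=dev") as conn:   # hypothetical DSN
    with conn.cursor() as cur:
        cur.execute("""
            EXPLAIN ANALYZE
            SELECT player_id, SUM(runs)
            FROM match_events
            WHERE match_date >= '2024-01-01'
            GROUP BY player_id
        """)
        for (line,) in cur.fetchall():
            print(line)   # look for Seq Scan on large tables

# If the plan shows a sequential scan filtered on match_date, a typical fix:
#   CREATE INDEX idx_match_events_date ON match_events (match_date);
```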
Desired Skills and Experience :
- 7+ years of experience in database development
- Minimum 4+ years of experience in PostgreSQL is a must
- Experience and in-depth knowledge in PL/SQL
- Ability to come up with multiple possible ways of solving a problem and decide on the approach that best suits the use case.
- Knowledge of database administration, with the ability and experience to use CLI tools for administration.
- Experience in Big Data technologies is an added advantage
- Secondary platforms: MS SQL 2005/2008, Oracle, MySQL
- Ability to take ownership of tasks and flexibility to work individually or in a team
- Ability to communicate with teams and clients across time zones and global regions
- Good communication skills and self-motivation
- Should have the ability to work under pressure
- Knowledge of NoSQL and Cloud Architecture will be an advantage
Key Responsibilities
- Lead and inspire high-performing engineering teams to build and scale highly reliable, secure and responsive architecture for all LoveLocal products
- Build highly scalable platforms using Python, ReactJS, NodeJS, Kafka, distributed caching (a cache-aside sketch follows this list), SQL and NoSQL databases
- Take full ownership of the people-related aspects of the Engineering team: engagement, recruiting, retention, compensation, promotions, individual performance management and organizational structure
- Should be actively involved in hunting and recruiting new talent
- Manage the engineering pipeline at all levels, from planning the product implementation with the developers to distributing and managing the work of the engineering team, and ensure code quality across products and services
- Should continuously plan and build for increasing scale of the product and ensure full hygiene of the tech infrastructure and code sanity
- Should be the primary point of contact of functional heads for all tech dependencies
- Take ownership of the overall tech budget, reviewing expenses and planning for future costs
- Work closely with the CTO and Product Head on the tech roadmap of the organisation
- Become an integral part of the core team and be strongly aligned with the mission and values of the company
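Here is a hedged sketch of the cache-aside pattern behind the "distributed caching" responsibility above, using redis-py. The key scheme and the database loader are invented for illustration.

```python
# Cache-aside with Redis: check the cache first, fall back to the database,
# and populate the cache with a TTL on a miss.
import json
import redis

r = redis.Redis(host="localhost", port=6379, db=0)

def load_product_from_db(product_id: int) -> dict:
    # Placeholder for the real SQL/NoSQL lookup.
    return {"id": product_id, "name": "sample", "price": 99.0}

def get_product(product_id: int) -> dict:
    key = f"product:{product_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)            # cache hit
    product = load_product_from_db(product_id)
    r.setex(key, 300, json.dumps(product))   # cache for 5 minutes
    return product

print(get_product(42))
```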
Main Qualifications
- 8+ years of experience in software development with end-to-end implementation of products (preferably in a product-driven organisation)
- Minimum 3 years of experience leading a team of at least 15 developers, either as an EM or a Tech Lead
- Should have worked in a well-scaled and fast-paced working environment (startups beyond Series B or a major tech organisation)
- Comfortable handling multiple competing priorities in a fast-paced environment
- Well versed with cloud computing and infrastructure management (AWS / GCP)
- Has worked extensively on designing and implementing microservices architecture
- Should have experience working with NoSQL data stores such as MongoDB, Cassandra, HBase, DynamoDB, etc.
- Should have experience working with Big Data streaming services such as Kinesis, Kafka, RabbitMQ, etc.
- Should be highly skilled in Python and have solid exposure to JavaScript
- Should have solid exposure to web development and familiarity working with Android and iOS engineers
- Excellent understanding of software engineering practices and design patterns
- A Bachelor's or Master's in Computer Science or an equivalent engineering degree from a top engineering institute
Job requirements
- A strong engineer with excellent Ruby experience working with Ruby on Rails
- Experience with Node.js
- Experience with SQL/NoSQL databases (PostgreSQL, Cassandra, MongoDB)
- Experience with REST services and API design
- Experience building systems for scale
- Experience with version control systems (Bitbucket, Git, etc.)
- Experience working with AWS
- Experience with Docker/microservices will be an added advantage
- Knowledge of unit & integration testing
- Knowledge of agile development processes, Jira
- Strong knowledge of algorithms and Data structures
- Basic understanding of the HTTP protocol
- Demonstrated experience working on application development projects and test-driven development. Experience in writing high quality code
- Knowledge of blockchain technology, smart contracts and cryptocurrency will be an added advantage
- Experience in fintech domain will be another added advantage
- Bachelor’s degree in computer programming, computer science, or a related field.
- Fluency or understanding of specific languages, such as Java, PHP, or Python, and operating systems may be required.
- 3+ years of experience with Ruby On Rails.
- Strong Project & Time Management Skills, along with the ability to apply these skills while working independently, or as part of a team.
Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for the development of data pipelines for numerous applications handling all kinds of data: structured, semi-structured & unstructured. Big data knowledge, especially in Spark & Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 3+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing decently optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala (a brief Spark-on-Hive sketch follows this list)
- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
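As a brief sketch of the Spark-plus-Hive work implied above: read a Hive table through Spark SQL and write an aggregated result back. The database and table names are hypothetical, and enableHiveSupport assumes a configured Hive metastore.

```python
# Aggregate a Hive table with Spark SQL and persist the result.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive_aggregation")
         .enableHiveSupport()
         .getOrCreate())

daily = spark.sql("""
    SELECT event_date, COUNT(*) AS events
    FROM analytics.raw_events          -- hypothetical Hive table
    GROUP BY event_date
""")

daily.write.mode("overwrite").saveAsTable("analytics.daily_event_counts")
spark.stop()
```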
Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for the development of data pipelines for numerous applications handling all kinds of data: structured, semi-structured & unstructured. Big data knowledge, especially in Spark & Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 5+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing decently optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
Company is into Product Development.
What's the role?
Your role as a Principal Engineer will involve working with various teams. As a principal engineer, you will need full knowledge of the software development lifecycle and agile methodologies. You will demonstrate multi-tasking skills under tight deadlines and constraints. You will regularly contribute to the development of work products (including analyzing, designing, programming, debugging, and documenting software) and may work with customers to resolve challenges and respond to suggestions for improvements and enhancements. You will set the standards and principles for the product you drive.
- Setup coding practice, guidelines & quality of the software delivered.
- Determines operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
- Documents and demonstrates solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code.
- Prepares and installs solutions by determining and designing system specifications, standards, and programming.
- Improves operations by conducting systems analysis; recommending changes in policies and procedures.
- Updates job knowledge by studying state-of-the-art development tools, programming techniques, and computing equipment; participating in educational opportunities; reading professional publications; maintaining personal networks; participating in professional organizations.
- Protects operations by keeping information confidential.
- Develops software solutions by studying information needs; conferring with users; studying systems flow, data usage, and work processes; investigating problem areas; following the software development lifecycle.
Who are you?
You are a go-getter with an eye for detail, strong problem-solving and debugging skills, and a BE/MCA/M.E./M.Tech degree or equivalent from a reputed college/university.
Essential Skills / Experience:
- 10+ years of engineering experience
- Experience in designing and developing high-volume web services using API protocols and data formats (a minimal sketch follows this list)
- Proficient in API modelling languages and annotation
- Proficient in Java programming
- Experience with Scala programming
- Experience with ETL systems
- Experience with Agile methodologies
- Experience with Cloud service & storage
- Proficient in Unix/Linux operating systems
- Excellent oral and written communication skills
Preferred:
- Functional programming languages (Scala, etc)
- Scripting languages (bash, Perl, Python, etc)
- Amazon Web Services (Redshift, ECS, etc.)
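To close, here is a minimal sketch of the kind of JSON web service the role describes, using Flask. The resource, routes, and payload shape are illustrative assumptions, and the in-memory dict stands in for a real database.

```python
# A minimal JSON REST service with Flask.
from flask import Flask, jsonify, request

app = Flask(__name__)

ORDERS = {}   # in-memory store standing in for a real database

@app.route("/orders/<int:order_id>", methods=["GET"])
def get_order(order_id):
    order = ORDERS.get(order_id)
    if order is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(order)

@app.route("/orders", methods=["POST"])
def create_order():
    payload = request.get_json()
    order_id = len(ORDERS) + 1
    ORDERS[order_id] = {"id": order_id, **payload}
    return jsonify(ORDERS[order_id]), 201

if __name__ == "__main__":
    app.run(port=8080)
```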