Big data Jobs in Mumbai

27+ Big data Jobs in Mumbai | Big data Job openings in Mumbai

Apply to 27+ Big data Jobs in Mumbai on CutShort.io. Explore the latest Big data Job opportunities across top companies like Google, Amazon & Adobe.

This opening is with an MNC
Mumbai, Malad, Andheri
8 - 13 yrs
₹13L - ₹22L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+8 more

• Minimum of 8 years of experience, of which 4 years should be applied data mining experience in disciplines such as call centre metrics.

• Strong experience in advanced statistics and analytics, including segmentation, modelling, regression, forecasting, etc.

• Experience with leading and managing large teams.

• Demonstrated pattern of success in using advanced quantitative analytic methods to solve business problems.

• Demonstrated experience with Business Intelligence/Data Mining tools to work with data, investigate anomalies, construct data sets, and build models.

• Critical to share details of projects undertaken (preferably in the telecom industry), specifically through analysis from CRM.

Mumbai, Navi Mumbai
6 - 14 yrs
₹16L - ₹37L / yr
Python
PySpark
Data engineering
Big Data
Hadoop
+3 more

Role: Principal Software Engineer


We are looking for a passionate Principal Engineer - Analytics to build data products that extract valuable business insights for efficiency and customer experience. This role will require managing, processing and analyzing large amounts of raw information in scalable databases. It will also involve developing unique data structures and writing algorithms for an entirely new set of products. The candidate will be required to have critical thinking and problem-solving skills. The candidate must be experienced in software development with advanced algorithms and must be able to handle large volumes of data. Exposure to statistics and machine learning algorithms is a big plus. The candidate should have some exposure to cloud environments, continuous integration and agile scrum processes.



Responsibilities:


• Lead projects both as a principal investigator and project manager, responsible for meeting project requirements on schedule

• Software development that creates data-driven intelligence in products that deal with Big Data backends

• Exploratory analysis of the data to come up with efficient data structures and algorithms for given requirements

• The system may or may not involve machine learning models and pipelines but will require advanced algorithm development

• Managing data in large-scale data stores (such as NoSQL DBs, time-series DBs, geospatial DBs, etc.)

• Creating metrics and evaluating algorithms for better accuracy and recall

• Ensuring efficient access and usage of data by means of indexing, clustering, etc.

• Collaborate with engineering and product development teams.


Requirements:


• Master's or Bachelor's degree in Engineering in one of these domains - Computer Science, Information Technology, Information Systems, or a related field - from a top-tier school

• OR a Master's degree or higher in Statistics or Mathematics, with a hands-on background in software development.

• 8 to 10 years of experience in product development, having done algorithmic work

• 5+ years of experience working with large data sets or doing large-scale quantitative analysis

• Understanding of SaaS-based products and services.

• Strong algorithmic problem-solving skills

• Able to mentor and manage a team and take responsibility for team deadlines.


Skill set required:


• In-depth knowledge of the Python programming language

• Understanding of software architecture and software design

• Must have fully managed a project with a team

• Experience working with Agile project management practices

• Experience with data processing, analytics and visualization tools in Python (such as pandas, Matplotlib, SciPy, etc.) - a brief illustrative sketch follows this list

• Strong understanding of SQL and querying NoSQL databases (e.g. MongoDB, Cassandra, Redis)
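
For illustration only (not part of the original posting), here is a minimal sketch of the kind of pandas-based data processing this role refers to. It is a hedged example: the file name, column names and threshold are hypothetical assumptions, not requirements from the ad.

    import pandas as pd

    # Load a hypothetical events extract (file name and schema are assumptions).
    events = pd.read_csv("events.csv", parse_dates=["event_time"])

    # Aggregate event volume per day.
    daily = events.groupby(events["event_time"].dt.date).size().rename("event_count")

    # Flag days whose volume deviates strongly from the mean (a simple anomaly check).
    threshold = daily.mean() + 3 * daily.std()
    print(daily[daily > threshold])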

Encubate Tech Private Ltd
Mumbai
5 - 6 yrs
₹15L - ₹20L / yr
Amazon Web Services (AWS)
Amazon Redshift
Data modeling
ETL
Agile/Scrum
+7 more

Roles and Responsibilities

Seeking an AWS Cloud Engineer / Data Warehouse Developer for our Data CoE team to help us configure and develop new AWS environments for our Enterprise Data Lake and migrate on-premise traditional workloads to the cloud. Must have a sound understanding of BI best practices, relational structures, dimensional data modelling, structured query language (SQL) skills, data warehouse and reporting techniques.

• Extensive experience in providing AWS Cloud solutions to various business use cases.
• Creating star schema data models, performing ETL and validating results with business representatives.
• Supporting implemented BI solutions by monitoring and tuning queries and data loads, addressing user questions concerning data integrity, monitoring performance and communicating functional and technical issues.

Job Description:

This position is responsible for the successful delivery of business intelligence information to the entire organization and requires experience in BI development and implementations, data architecture and data warehousing.

Requisite Qualification

Essential: AWS Certified Database Specialty or AWS Certified Data Analytics
Preferred: Any other Data Engineer certification

Requisite Experience

Essential: 4-7 yrs of experience
Preferred: 2+ yrs of experience in ETL & data pipelines

Skills Required

Special Skills Required
• AWS: S3, DMS, Redshift, EC2, VPC, Lambda, Delta Lake, CloudWatch, etc.
• Big Data: Databricks, Spark, Glue and Athena
• Expertise in Lake Formation, Python programming, Spark, Shell scripting
• Minimum Bachelor's degree with 5+ years of experience in designing, building, and maintaining AWS data components
• 3+ years of experience in data component configuration, related roles and access setup
• Expertise in Python programming
• Knowledge of all aspects of DevOps (source control, continuous integration, deployments, etc.)
• Comfortable working with DevOps tools: Jenkins, Bitbucket, CI/CD
• Hands-on ETL development experience, preferably using SSIS
• SQL Server experience required
• Strong analytical skills to solve and model complex business requirements
• Sound understanding of BI best practices/methodologies, relational structures, dimensional data modelling, structured query language (SQL) skills, data warehouse and reporting techniques

Preferred Skills
• Experience working in a SCRUM environment
• Experience in Administration (Windows/Unix/Network/Database/Hadoop) is a plus
• Experience in SQL Server, SSIS, SSAS, SSRS
• Comfortable with creating data models and visualizations using Power BI
• Hands-on experience in relational and multi-dimensional data modelling, including multiple source systems from databases and flat files, and the use of standard data modelling tools
• Ability to collaborate on a team with infrastructure, BI report development and business analyst resources, and clearly communicate solutions to both technical and non-technical team members

Netcore Cloud
Mumbai, Navi Mumbai, Bengaluru (Bangalore), Pune
5 - 9 yrs
₹10L - ₹35L / yr
Java
Spring Boot
Apache Kafka
RabbitMQ
Cassandra
+3 more

Job Title - Senior Java Developers

Job Description - Backend Engineer - Lead (Java)

Mumbai, India | Engineering Team | Full-time

 

Are you passionate enough to be a crucial part of a highly analytical and scalable user engagement platform?

Are you ready to learn new technologies and willing to step out of your comfort zone to explore and learn new skills?

 

If so, this is an opportunity for you to join a high-functioning team and make your mark on our organisation!

 

The Impact you will create:

  • Build campaign generation services which can send app notifications at a speed of 10 million a minute
  • Build dashboards to show real-time key performance indicators to clients
  • Develop complex user segmentation engines which create segments on terabytes of data within a few seconds
  • Build highly available & horizontally scalable platform services for ever-growing data
  • Use cloud-based services like AWS Lambda for blazing fast throughput & auto scalability
  • Work on complex analytics on terabytes of data, like building Cohorts, Funnels, User Path analysis, and Recency, Frequency & Monetary analysis at blazing speed
  • You will build backend services and APIs to create scalable engineering systems.
  • As an individual contributor, you will tackle some of our broadest technical challenges that require deep technical knowledge, hands-on software development and seamless collaboration with all functions.
  • You will envision and develop features that are highly reliable and fault tolerant to deliver a superior customer experience.
  • Collaborate with various highly functional teams in the company to meet deliverables throughout the software development lifecycle.
  • Identify areas of improvement through data insights and research, and act on them.

 

What we look for?

  • 5-9 years of experience in backend development; must have worked on Java/Shell/Perl/Python scripting.
  • Solid understanding of engineering best practices, continuous integration, and incremental delivery.
  • Strong analytical skills, debugging and troubleshooting skills, product line analysis.
  • Follower of agile methodology (sprint planning, working on JIRA, retrospectives, etc.).
  • Proficiency in the use of tools like Docker, Maven, Jenkins, and knowledge of Java frameworks like Spring, Spring Boot, Hibernate, JPA.
  • Ability to design application modules using various concepts like object orientation, multi-threading, synchronization, caching, fault tolerance, sockets, various IPCs, database interfaces, etc.
  • Hands-on experience with Redis, MySQL, streaming technologies like Kafka producers/consumers, and NoSQL databases like MongoDB/Cassandra.
  • Knowledge of version control (Git) and deployment processes (CI/CD).

What’s in it for you?

 

  • Immense growth, continuous learning and delivering the best to top-notch brands
  • Work with some of the most innovative brains
  • Opportunity to explore your entrepreneurial mindset
  • Open culture where your creative bug gets activated.

 

If this sounds like a company you would like to be a part of, and a role you would thrive in, please don’t hold back from applying! We need your unique perspective for our continued innovation and success!

So let's converse! Our inquisitive nature is keen to know more about you.

Skills

Java, MongoDB, Redis, Cassandra, Kafka, RabbitMQ


 

Tata Digital Pvt Ltd
Agency job
via Seven N Half by Priya Singh
Mumbai, Mangalore, Gurugram
5 - 11 yrs
₹1L - ₹15L / yr
SOA
EAI
ESB
J2EE
RESTful APIs
+14 more

Role / Purpose - Lead Developer - API and Microservices

Must have a strong hands-on development track record building integrations utilizing a variety of integration products, tools, protocols, technologies, and patterns.

  • Must have an in-depth understanding of SOA/EAI/ESB concepts, SOA Governance, Event-Driven Architecture, message-based architectures, file sharing and exchange platforms, data virtualization and caching strategies, J2EE design patterns, and frameworks
  • Should possess experience with at least one of the middleware technologies (Application Servers, BPMS, BRMS, ESB & Message Brokers), programming languages (e.g. Java/J2EE, JavaScript, COBOL, C), operating systems (e.g. Windows, Linux, MVS), and databases (DB2, MySQL, NoSQL databases like MongoDB, Cassandra, Hadoop, etc.)
  • Must have advanced skills in implementing API service architectures (SOAP, REST) using any of the market-leading API management tools such as Apigee and frameworks such as Spring Boot for microservices
  • Appetite to manage large-scale projects and multiple tracks
  • Experience and know-how in the e-commerce and retail domains are preferred
  • Good communication and people management skills
Celebal Technologies
Posted by Payal Hasnani
Jaipur, Noida, Gurugram, Delhi, Ghaziabad, Faridabad, Pune, Mumbai
5 - 15 yrs
₹7L - ₹25L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+4 more
Job Responsibilities:

• Project Planning and Management
  o Take end-to-end ownership of multiple projects / project tracks
  o Create and maintain project plans and other related documentation for project objectives, scope, schedule and delivery milestones
  o Lead and participate across all the phases of software engineering, right from requirements gathering to GO LIVE
  o Lead internal team meetings on solution architecture, effort estimation, manpower planning and resource (software/hardware/licensing) planning
  o Manage RIDA (Risks, Impediments, Dependencies, Assumptions) for projects by developing effective mitigation plans
• Team Management
  o Act as the Scrum Master
  o Conduct SCRUM ceremonies like Sprint Planning, Daily Standup, Sprint Retrospective
  o Set clear objectives for the project and roles/responsibilities for each team member
  o Train and mentor the team on their job responsibilities and SCRUM principles
  o Make the team accountable for their tasks and help the team in achieving them
  o Identify the requirements and come up with a plan for Skill Development for all team members
• Communication
  o Be the Single Point of Contact for the client in terms of day-to-day communication
  o Periodically communicate project status to all the stakeholders (internal/external)
• Process Management and Improvement
  o Create and document processes across all disciplines of software engineering
  o Identify gaps and continuously improve processes within the team
  o Encourage team members to contribute towards process improvement
  o Develop a culture of quality and efficiency within the team

Must have:
• Minimum 8 years of experience (hands-on as well as leadership) in software / data engineering across multiple job functions like Business Analysis, Development, Solutioning, QA, DevOps and Project Management
• Hands-on as well as leadership experience in Big Data Engineering projects
• Experience developing or managing cloud solutions using Azure or another cloud provider
• Demonstrable knowledge of Hadoop, Hive, Spark, NoSQL DBs, SQL, Data Warehousing, ETL/ELT, DevOps tools
• Strong project management and communication skills
• Strong analytical and problem-solving skills
• Strong systems-level critical thinking skills
• Strong collaboration and influencing skills

Good to have:
• Knowledge of PySpark, Azure Data Factory, Azure Data Lake Storage, Synapse Dedicated SQL Pool, Databricks, PowerBI, Machine Learning, Cloud Infrastructure
• Background in BFSI with focus on core banking
• Willingness to travel

Work Environment
• Customer Office (Mumbai) / Remote Work

Education
• UG: B. Tech - Computers / B. E. – Computers / BCA / B.Sc. Computer Science
xlnc technologies
Posted by Shraddha Rawal
Mumbai
2 - 5 yrs
₹4L - ₹8L / yr
ASP.NET
.NET
Big Data
PowerBI
Tableau
+1 more

Responsibilities

  • Manage and drive a team of Data Analysts and Sr. Data Analysts to provide logistics and supply chain solutions.
  • Conduct meetings with clients to gather requirements and understand the scope.
  • Conduct meetings with internal stakeholders to walk them through the solution and hand over the analysis.
  • Define business problems, identify solutions, and provide analysis and insights from the client's data.
  • Conduct scheduled progress reviews on all projects and interact with the onsite team daily.
  • Ensure solutions are delivered error-free and submitted on time.
  • Implement ETL processes using Pentaho Data Integration (Pentaho ETL); design and implement data models in Hadoop.
  • Provide end-user training and technical assistance to maximize utilization of tools.
  • Deliver technical guidance to the team, including hands-on development as necessary; oversee standards, change controls and the documentation library for training and reuse.

Requirements

  • Bachelor's degree in Engineering.
  • 16+ years of experience in supply chain and logistics or a related industry, plus analytics experience.
  • 3 years of experience in team handling (8+ people) and interacting with executive leadership teams.
  • Strong project and time management skills with the ability to multitask and prioritize workload.
  • Solid expertise with MS Excel, SQL, visualization tools like Tableau/Power BI, and ETL tools.
  • Proficiency in Hadoop / Hive.
  • Experience with Pentaho ETL, Pentaho Visualization API, Tableau.
  • Hands-on experience working with big data sets (data sets with millions of records).
  • Strong technical and management experience.

 

Desired Skills and Experience

  • .NET, ASP.NET
Mumbai
2 - 5 yrs
₹2L - ₹8L / yr
Data Warehouse (DWH)
Informatica
ETL
Microsoft Windows Azure
Big Data
+1 more
1. Responsible for the evaluation of cloud strategy and program architecture
2. Responsible for gathering system requirements, working together with application architects and owners
3. Responsible for generating scripts and templates required for the automatic provisioning of resources
4. Discover standard cloud services offerings, install, and execute processes and standards for optimal use of cloud service provider offerings
5. Incident Management on IaaS, PaaS, SaaS
6. Responsible for debugging technical issues inside a complex stack involving virtualization, containers, microservices, etc.
7. Collaborate with the engineering teams to enable their applications to run on Cloud infrastructure
8. Experience with OpenStack, Linux, Amazon Web Services, Microsoft Azure, DevOps, NoSQL, etc. will be a plus
9. Design, implement, configure, and maintain various Azure IaaS, PaaS, SaaS services
10. Deploy and maintain Azure IaaS Virtual Machines and Azure Application and Networking Services
11. Optimize Azure billing for cost/performance (VM optimization, reserved instances, etc.)
12. Implement and fully document IT projects
13. Identify improvements to IT documentation, network architecture, processes/procedures, and tickets
14. Research products and new technologies to increase efficiency of business and operations
15. Keep all tickets and projects updated and track time in a detailed format
16. Should be able to multi-task and work across a range of projects and issues with various timelines and priorities
Technical:
• Minimum 1 year of experience with Azure; knowledge of Office 365 services preferred
• Formal education in IT preferred
• Experience with the Managed Services business model is a major plus
• Bachelor's degree preferred
Zycus
Posted by Siddharth Shilimkar
Mumbai, Bengaluru (Bangalore), Pune
14 - 20 yrs
₹15L - ₹40L / yr
Engineering Management
Engineering Manager
Engineering Director
Engineering Head
VP of Engineering
+32 more

We are looking for a Director of Engineering to lead one of our key product engineering teams. This role will report directly to the VP of Engineering and will be responsible for successful execution of the company's business mission through development of cutting-edge software products and solutions.

  • As an owner of the product, you will be required to plan and execute the product road map and provide technical leadership to the engineering team.
  • You will have to collaborate with Product Management and Implementation teams and build a commercially successful product.
  • You will be responsible for recruiting and leading a team of highly skilled software engineers and providing strong hands-on engineering leadership.
  • Deep technical knowledge in Software Product Engineering using Java/J2EE, Node.js, React.js, full stack, NoSQL DBs (MongoDB, Cassandra, Neo4j), Elasticsearch, Kibana, ELK, Kafka, Redis, Docker, Kubernetes, Apache Solr, ActiveMQ, RabbitMQ, Spark, Scala, Sqoop, HBase, Hive, WebSockets, web crawlers, Spring Boot, etc. is a must.

Requirements

16+ years of experience in Software Engineering with at least 5+ years as an engineering leader in a software product company.

  • Hands-on technical leadership with proven ability to recruit high-performance talent
  • High technical credibility - ability to audit technical decisions and push for the best solution to a problem
  • Experience building E2E applications right from the backend database to the presentation layer
  • Experience with UI technologies (Angular, React.js, Node.js) or a full-stack environment is preferred
  • Experience with NoSQL technologies (MongoDB, Cassandra, Neo4j, DynamoDB, etc.)
  • Elasticsearch, Kibana, ELK, Logstash
  • Experience in developing enterprise software using Agile methodology
  • Good understanding of Kafka, Redis, ActiveMQ, RabbitMQ, Solr, etc.
  • SaaS cloud-based platform exposure
  • Experience with Docker, Kubernetes, etc.
  • Ownership of E2E design and development, with exposure to delivering quality enterprise products/applications
  • A track record of setting and achieving high standards
  • Strong understanding of modern technology architecture
  • Key programming skills: Java, J2EE with cutting-edge technologies
  • Excellent team building, mentoring and coaching skills are a must-have

Benefits

Five Reasons Why You Should Join Zycus

  1. Cloud Product Company: We are a Cloud SaaS Company and our products are created by using the latest technologies like ML and AI. Our UI is in Angular JS and we are developing our mobile apps using React.
  2. A Market Leader: Zycus is recognized by Gartner (world’s leading market research analyst) as a Leader in Procurement Software Suites.
  3. Move between Roles: We believe that change leads to growth and therefore we allow our employees to shift careers and move to different roles and functions within the organization
  4. Get a Global Exposure: You get to work and deal with our global customers.
  5. Create an Impact: Zycus gives you the environment to create an impact on the product and transform your ideas into reality. Even our junior engineers get the opportunity to work on different product features.

About Us

Zycus is a pioneer in Cognitive Procurement software and has been a trusted partner of choice for large global enterprises for two decades. Zycus has been consistently recognized by Gartner, Forrester, and other analysts for its Source to Pay integrated suite. Zycus powers its S2P software with the revolutionary Merlin AI Suite. Merlin AI takes over the tactical tasks and empowers procurement and AP officers to focus on strategic projects; offers data-driven actionable insights for quicker and smarter decisions, and its conversational AI offers a B2C type user-experience to the end-users.

Zycus helps enterprises drive real savings, reduce risks, and boost compliance, and its seamless, intuitive, and easy-to-use user interface ensures high adoption and value across the organization.

Start your #CognitiveProcurement journey with us, as you are #MeantforMore.

 

Click here to apply:

Director of Engineering - Zycus (Mumbai): https://apply.workable.com/zycus-1/j/D926111745/

Director of Engineering - Zycus (Bengaluru): https://apply.workable.com/zycus-1/j/90665BFD4C/

Director of Engineering - Zycus (Pune): https://apply.workable.com/zycus-1/j/3A5FBA2C7C/

 

Ganit Business Solutions
Posted by Viswanath Subramanian
Chennai, Bengaluru (Bangalore), Mumbai
4 - 6 yrs
₹7L - ₹15L / yr
SQL
Amazon Web Services (AWS)
Data Warehouse (DWH)
Informatica
ETL
+1 more

Responsibilities:

  • Must be able to write quality code and build secure, highly available systems.
  • Assemble large, complex datasets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc., with guidance.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Monitor performance and advise on any necessary infrastructure changes.
  • Define data retention policies.
  • Implement the ETL process and optimal data pipeline architecture.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Create design documents that describe the functionality, capacity, architecture, and process.
  • Develop, test, and implement data solutions based on finalized design documents.
  • Work with data and analytics experts to strive for greater functionality in our data.
  • Proactively identify potential production issues and recommend and implement solutions.

Skillsets:

  • Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
  • Proficient understanding of distributed computing principles.
  • Experience in working with batch processing / real-time systems using various open-source technologies like NoSQL, Spark, Pig, Hive, Apache Airflow.
  • Implemented complex projects dealing with considerable data sizes (PB).
  • Optimization techniques (performance, scalability, monitoring, etc.).
  • Experience with integration of data from multiple data sources.
  • Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, etc.
  • Knowledge of various ETL techniques and frameworks, such as Flume.
  • Experience with various messaging systems, such as Kafka or RabbitMQ.
  • Good understanding of Lambda Architecture, along with its advantages and drawbacks.
  • Creation of DAGs for data engineering (a brief illustrative sketch follows this list).
  • Expert at Python / Scala programming, especially for data engineering / ETL purposes.
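
As a hedged illustration of the "creation of DAGs for data engineering" item above (not part of the original posting), the sketch below shows a minimal Apache Airflow DAG. It assumes Airflow 2.x, and the DAG id, schedule and task bodies are placeholder assumptions rather than anything specified in the ad.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder extract step; real logic would pull from a source system.
        print("extracting records")

    def load():
        # Placeholder load step; real logic would write to the warehouse.
        print("loading records")

    with DAG(
        dag_id="example_etl",            # hypothetical DAG id
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # run extract before load
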
AI-powered Growth Marketing platform
Mumbai, Bengaluru (Bangalore)
2 - 7 yrs
₹8L - ₹25L / yr
Java
NOSQL Databases
MongoDB
Cassandra
Apache
+3 more
The Impact You Will Create
  • Build campaign generation services which can send app notifications at a speed of 10 million a minute
  • Build dashboards to show real-time key performance indicators to clients
  • Develop complex user segmentation engines which create segments on terabytes of data within a few seconds
  • Build highly available & horizontally scalable platform services for ever-growing data
  • Use cloud-based services like AWS Lambda for blazing fast throughput & auto scalability
  • Work on complex analytics on terabytes of data, like building Cohorts, Funnels, User Path analysis, and Recency, Frequency & Monetary analysis at blazing speed
  • You will build backend services and APIs to create scalable engineering systems.
  • As an individual contributor, you will tackle some of our broadest technical challenges that require deep technical knowledge, hands-on software development and seamless collaboration with all functions.
  • You will envision and develop features that are highly reliable and fault tolerant to deliver a superior customer experience.
  • Collaborate with various highly functional teams in the company to meet deliverables throughout the software development lifecycle.
  • Identify areas of improvement through data insights and research, and act on them.
What we look for?
  • 2-5 years of experience in backend development; must have worked on Java/Shell/Perl/Python scripting.
  • Solid understanding of engineering best practices, continuous integration, and incremental delivery.
  • Strong analytical skills, debugging and troubleshooting skills, product line analysis.
  • Follower of agile methodology (sprint planning, working on JIRA, retrospectives, etc.).
  • Proficiency in the use of tools like Docker, Maven, Jenkins, and knowledge of Java frameworks like Spring, Spring Boot, Hibernate, JPA.
  • Ability to design application modules using various concepts like object orientation, multi-threading, synchronization, caching, fault tolerance, sockets, various IPCs, database interfaces, etc.
  • Hands-on experience with Redis, MySQL, streaming technologies like Kafka producers/consumers, and NoSQL databases like MongoDB/Cassandra.
  • Knowledge of version control (Git) and deployment processes (CI/CD).
upGrad
Posted by Priyanka Muralidharan
Mumbai, Bengaluru (Bangalore)
8 - 12 yrs
₹40L - ₹60L / yr
Technical Architecture
Technical architect
Java
Go Programming (Golang)
React.js
+10 more
About Us

upGrad is an online education platform building the careers of tomorrow by offering the most industry-relevant programs in an immersive learning experience. Our mission is to create a new digital-first learning experience to deliver tangible career impact to individuals at scale. upGrad currently offers programs in Data Science, Machine Learning, Product Management, Digital Marketing, and Entrepreneurship, etc. upGrad is looking for people passionate about management and education to help design learning programs for working professionals to stay sharp and stay relevant and help build the careers of tomorrow.
  • upGrad was awarded the Best Tech for Education by IAMAI for 2018-19,
  • upGrad was also ranked as one of the LinkedIn Top Startups 2018: The 25 most sought-after startups in India.
  • upGrad was earlier selected as one of the top ten most innovative companies in India by FastCompany.
  • We were also covered by the Financial Times along with other disruptors in Ed-Tech.
  • upGrad is the official education partner for Government of India - Startup India program.
  • Our program with IIIT B has been ranked #1 program in the country in the domain of Artificial Intelligence and Machine Learning.

About the Role

A highly motivated individual who has experience in architecting end-to-end web-based ecommerce/online/SaaS products and systems, bringing them to production quickly and with high quality. Able to understand expected business results and map architecture to drive the business forward. Passionate about building world-class solutions.

Role and Responsibilities

  • Work with Product Managers and Business to understand business/product requirements and vision.
  • Provide a clear architectural vision in line with business and product vision.
  • Lead a team of architects, developers, and data engineers to provide platform services to other engineering teams.
  • Provide architectural oversight to engineering teams across the organization.
  • Hands on design and development of platform services and features owned by self - this is a hands-on coding role.
  • Define guidelines for best practices covering design, unit testing, secure coding etc.
  • Ensure quality by reviewing design, code, test plans, load test plans etc. as appropriate.
  • Work closely with the QA and Support teams to track quality and proactively identify improvement opportunities.
  • Work closely with DevOps and IT to ensure highly secure and cost optimized operations in the cloud.
  • Grow technical skills in the team - identify skill gaps with plans to address them, participate in hiring, mentor other architects and engineers.
  • Support other engineers in resolving complex technical issues as a go-to person.

Skills/Experience
  • 12+ years of experience in design and development of ecommerce scale systems and highly scalable SaaS or enterprise products.
  • Extensive experience in developing extensible and scalable web applications with
    • Java, Spring Boot, Go
    • Web Services - REST, OAuth, OData
    • Database/Caching - MySQL, Cassandra, MongoDB, Memcached/Redis
    • Queue/Broker services - RabbitMQ/Kafka
    • Microservices architecture via Docker on AWS or Azure.
    • Experience with web front end technologies - HTML5, CSS3, JavaScript libraries and frameworks such as jQuery, AngularJS, React, Vue.js, Bootstrap etc.
  • Extensive experience with cloud based architectures and how to optimize design for cost.
  • Expert level understanding of secure application design practices and a working understanding of cloud infrastructure security.
  • Experience with CI/CD processes and design for testability.
  • Experience working with big data technologies such as Spark/Storm/Hadoop/Data Lake Architectures is a big plus.
  • Action and result-oriented problem-solver who works well both independently and as part of a team; able to foster and develop others' ideas as well as his/her own.
  • Ability to organize, prioritize and schedule a high workload and multiple parallel projects efficiently.
  • Excellent verbal and written communication with stakeholders in a matrixed environment.
  • Long term experience with at least one product from inception to completion and evolution of the product over multiple years.
Qualification
B.Tech/MCA (IT/Computer Science) from a premier institution (IIT/NIT/BITS) and/or a US Master's degree in Computer Science.
Sportz Interactive
Remote, Mumbai, Navi Mumbai, Pune, Nashik
7 - 12 yrs
₹15L - ₹16L / yr
PostgreSQL
PL/SQL
Big Data
Optimization
Stored Procedures

Job Role : Associate Manager (Database Development)


Key Responsibilities:

  • Optimizing the performance of many stored procedures and SQL queries to deliver large amounts of data in under a few seconds.
  • Designing and developing numerous complex queries, views, functions, and stored procedures to work seamlessly with the Application/Development team's data needs.
  • Responsible for providing solutions to all data-related needs to support existing and new applications.
  • Creating scalable structures to cater to large user bases and manage high workloads.
  • Responsible for every step of the project, from the beginning stages of requirement gathering through implementation and maintenance.
  • Developing custom stored procedures and packages to support new enhancement needs.
  • Working with multiple teams to design, develop and deliver early warning systems.
  • Reviewing query performance and optimizing code.
  • Writing queries used for front-end applications.
  • Designing and coding database tables to store the application data.
  • Data modelling to visualize database structure.
  • Working with application developers to create optimized queries.
  • Maintaining database performance by troubleshooting problems.
  • Accomplishing platform upgrades and improvements by supervising system programming.
  • Securing the database by developing policies, procedures, and controls.
  • Designing and managing deep statistical systems.

Desired Skills and Experience  :

  • 7+ years of experience in database development
  • Minimum 4+ years of experience in PostgreSQL is a must
  • Experience and in-depth knowledge in PL/SQL
  • Ability to come up with multiple possible ways of solving a problem and decide on the most optimal approach for implementation that best suits the use case
  • Knowledge of database administration, and the ability and experience to use CLI tools for administration
  • Experience in Big Data technologies is an added advantage
  • Secondary platforms: MS SQL 2005/2008, Oracle, MySQL
  • Ability to take ownership of tasks and flexibility to work individually or in a team
  • Ability to communicate with teams and clients across time zones and global regions
  • Good communication skills and self-motivated
  • Should have the ability to work under pressure
  • Knowledge of NoSQL and Cloud Architecture will be an advantage
LoveLocal formerly mPaani
Posted by Ankit Sakhuja
Mumbai, Bengaluru (Bangalore)
8 - 10 yrs
₹35L - ₹45L / yr
Engineering Management
Engineering Manager
Engineering head
Technical Architecture
Technical lead
+8 more

Key Responsibilities


  • Lead and inspire high-performing engineering teams to build and scale highly reliable, secure and responsive architecture for all LoveLocal products

  • Build highly scalable platforms using Python, ReactJS, NodeJS, Kafka, Distributed Caching, SQL and NoSQL databases

  • Take full ownership of the people related aspects of the Engineering team engagement, recruiting, retention, compensation, promotions, individual performance management and organizational structure

  • Should be actively involved in hunting and recruiting new talent

  • Manage the engineering pipeline at all levels from planning the product implementation with the developers, distributing and managing work of the engineering team and ensure code quality across products and services

  • Should continuously plan and build for increasing scale of the product and ensure full hygiene of the tech infrastructure and code sanity

  • Should be the primary point of contact of functional heads for all tech dependencies

  • Take ownership of overall tech budget, reviewing expenses and planning for future costs

  • Work closely with CTO and Product Head on the Tech roadmap of the organisation

  • Become an integral part of the core team and be strongly aligned with the mission and values of the company

Main Qualifications

  • 8+ years of experience in Software Development with end-end implementation of products (Preferably in a product driven organisation)

  • Minimum 3 years of experience leading a team of at least 15 developers, either as an EM or a Tech Lead

  • Should have worked in a well scaled and fast paced working environment (Startups beyond Series B or a major tech organisation)

  • Comfortable handling multiple competing priorities in a fast-paced environment.

  • Well versed with cloud computing and infrastructure management (AWS / GCP)

  • Has worked extensively on designing and implementing Microservices architecture

  • Should have experience working with NoSQL data stores such as MongoDB, Cassandra, HBase, DynamoDB, etc.

  • Should have experience working with Big Data streaming services such as Kinesis, Kafka, RabbitMQ etc

  • Should be highly skilled in Python and have solid exposure to JavaScript

  • Should have solid exposure to web development and familiarity working with Android and iOS engineers

  • Excellent understanding of software engineering practices and Design Patterns

  • A Bachelor's or Master's in Computer Science or an equivalent engineering degree from a top engineering institute.

CoinDCX
Posted by Sumit Gupta
Mumbai, Bengaluru (Bangalore)
2 - 10 yrs
₹15L - ₹40L / yr
Ruby
Ruby on Rails (ROR)
NodeJS (Node.js)
Amazon Web Services (AWS)
Docker
+8 more

Job requirements

  • A strong engineer with excellent Ruby experience working with Ruby on Rails
  • Experience with Node.js
  • Experience with SQL/NoSQL databases (PostgreSQL, Cassandra, MongoDB)
  • Experience with REST services and API design
  • Experience with building the system for scale
  • Experience with version control systems (bitbucket, git etc.)
  • Experience working with AWS
  • Experience with docker/microservices will be an added advantage
  • Knowledge of unit & integration testing
  • Knowledge of agile development process, jira
  • Strong knowledge of algorithms and Data structures
  • Basic understanding of the HTTP protocol
  • Demonstrated experience working on application development projects and test-driven development. Experience in writing high quality code
  • Knowledge of blockchain technology, smart contracts and cryptocurrency will be an added advantage
  • Experience in fintech domain will be another added advantage
  • Bachelor’s degree in computer programming, computer science, or a related field.
  • Fluency or understanding of specific languages, such as Java, PHP, or Python, and operating systems may be required.
  • 3+ years of experience with Ruby On Rails.
  • Strong Project & Time Management Skills, along with the ability to apply these skills while working independently, or as part of a team.
Codalyze Technologies
Posted by Aishwarya Hire
Mumbai
3 - 9 yrs
₹5L - ₹12L / yr
Apache Hive
Hadoop
Scala
Spark
Amazon Web Services (AWS)
+2 more
Job Overview :

Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.

Responsibilities and Duties :

- As a Data Engineer you will be responsible for the development of data pipelines for numerous applications handling all kinds of data (structured, semi-structured & unstructured). Big data knowledge, especially in Spark & Hive, is highly preferred.

- Work in a team and provide proactive technical oversight, advising development teams and fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions

Education level :

- Bachelor's degree in Computer Science or equivalent

Experience :

- Minimum 3+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience

- Expertise in application, data and infrastructure architecture disciplines

- Expertise in designing data integrations using ETL and other data integration patterns

- Advanced knowledge of architecture, design and business processes

Proficiency in :

- Modern programming languages like Java, Python, Scala

- Big Data technologies: Hadoop, Spark, Hive, Kafka

- Writing decently optimized SQL queries

- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)

- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions

- Knowledge of system development lifecycle methodologies, such as waterfall and AGILE.

- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.

- Experience generating physical data models and the associated DDL from logical data models.

- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.

- Experience enforcing data modeling standards and procedures.

- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.

- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals

Skills :

Must Know :

- Core big-data concepts

- Spark - PySpark/Scala (a brief illustrative sketch follows this list)

- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)

- Handling of various file formats

- Cloud platform - AWS/Azure/GCP

- Orchestration tool - Airflow
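
For illustration only (not part of the original posting), the sketch below shows the kind of minimal PySpark batch job the "Spark - PySpark/Scala" item refers to. It is a hedged example: the input path, column name and output path are hypothetical assumptions.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_counts").getOrCreate()

    # Read a hypothetical CSV extract (path and schema are assumptions).
    events = spark.read.option("header", True).csv("s3://example-bucket/events/")

    # Count events per day and write the result as Parquet.
    daily = events.groupBy("event_date").agg(F.count("*").alias("event_count"))
    daily.write.mode("overwrite").parquet("s3://example-bucket/daily_event_counts/")
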
Codalyze Technologies
Posted by Aishwarya Hire
Mumbai
3 - 7 yrs
₹7L - ₹20L / yr
Hadoop
Big Data
Scala
Spark
Amazon Web Services (AWS)
+3 more
Job Overview :

Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.

Responsibilities and Duties :

- As a Data Engineer you will be responsible for the development of data pipelines for numerous applications handling all kinds of data (structured, semi-structured & unstructured). Big data knowledge, especially in Spark & Hive, is highly preferred.

- Work in a team and provide proactive technical oversight, advising development teams and fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions

Education level :

- Bachelor's degree in Computer Science or equivalent

Experience :

- Minimum 5+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience

- Expertise in application, data and infrastructure architecture disciplines

- Expertise in designing data integrations using ETL and other data integration patterns

- Advanced knowledge of architecture, design and business processes

Proficiency in :

- Modern programming languages like Java, Python, Scala

- Big Data technologies: Hadoop, Spark, Hive, Kafka

- Writing decently optimized SQL queries

- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)

- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions

- Knowledge of system development lifecycle methodologies, such as waterfall and AGILE.

- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.

- Experience generating physical data models and the associated DDL from logical data models.

- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.

- Experience enforcing data modeling standards and procedures.

- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.

- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals

Skills :

Must Know :

- Core big-data concepts

- Spark - PySpark/Scala

- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)

- Handling of various file formats

- Cloud platform - AWS/Azure/GCP

- Orchestration tool - Airflow
Remote, Mumbai
10 - 18 yrs
₹30L - ₹55L / yr
Scala
Big Data
Java
Amazon Web Services (AWS)
ETL

What's the role?

Your role as a Principal Engineer will involve working with various teams. As a Principal Engineer, you will need full knowledge of the software development lifecycle and Agile methodologies. You will demonstrate multi-tasking skills under tight deadlines and constraints. You will regularly contribute to the development of work products (including analyzing, designing, programming, debugging, and documenting software) and may work with customers to resolve challenges and respond to suggestions for improvements and enhancements. You will set the standards and principles for the products you drive.

  • Set up coding practices, guidelines & quality standards for the software delivered.
  • Determines operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
  • Documents and demonstrates solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code.
  • Prepares and installs solutions by determining and designing system specifications, standards, and programming.
  • Improves operations by conducting systems analysis; recommending changes in policies and procedures.
  • Updates job knowledge by studying state-of-the-art development tools, programming techniques, and computing equipment; participating in educational opportunities; reading professional publications; maintaining personal networks; participating in professional organizations.
  • Protects operations by keeping information confidential.
  • Develops software solutions by studying information needs; conferring with users; studying systems flow, data usage, and work processes; investigating problem areas; following the software development lifecycle.

Who are you? You are a go-getter with an eye for detail, strong problem-solving and debugging skills, and a BE/MCA/M.E./M.Tech degree or an equivalent degree from a reputed college/university.

 

Essential Skills / Experience:

  • 10+ years of engineering experience
  • Experience in designing and developing high volume web-services using API protocols and data formats
  • Proficient in API modelling languages and annotation
  • Proficient in Java programming
  • Experience with Scala programming
  • Experience with ETL systems
  • Experience with Agile methodologies
  • Experience with Cloud service & storage
  • Proficient in Unix/Linux operating systems
  • Excellent oral and written communication skills

Preferred:
  • Functional programming languages (Scala, etc.)
  • Scripting languages (bash, Perl, Python, etc.)
  • Amazon Web Services (Redshift, ECS, etc.)
Chalo
Posted by Damini Gawali
Seawoods, Navi Mumbai, Mumbai
1 - 7 yrs
₹15L - ₹30L / yr
Java
Data Structures
PHP
Python
Ruby on Rails (ROR)
+9 more

Responsibilities:

  • Own end to end development and operations of high-performance Spring Hibernate Applications.
  • Design the architecture and deliver clean, testable and scalable code
  • Participate in requirement gathering and display a strong sense of ownership and delivery

 

Skills and Qualifications:

  • Strong in Data Structures, algorithms and Object Oriented Concepts, Message Queues and Caching
  • BE/ B.Tech preferred
Globant
Posted by Risha P
Mumbai
5 - 10 yrs
₹12L - ₹20L / yr
Object Oriented Programming (OOPs)
Shell Scripting
Java
SOAP
JSON
+8 more
We are looking for a Java developer for one of our major investment banking clients who can take ownership of the whole end-to-end delivery, performing analysis, design, coding, testing and maintenance of large-scale and distributed applications. Please find the JD below for your reference.

Job Profile: Java Developer
Location: Mumbai

Description:
A core Java developer is required for a Tier 1 investment bank supporting the Delta One Structured Products IT group. This is a global front-office team that supports the global OTC Equity Swap Portfolio, Single Name, and Index derivative businesses. We are designing a complete restructure of the Equity Swaps trading platform, and this particular role is within the core cash flow and valuations area. The role will require the candidate to work closely with the cash flow engines team to solve problems that combine both finance and technology. This is an exciting hands-on role for a self-starter who has a thirst for new challenges as well as new technologies. The candidate should possess good analytical skills, strong software engineering skills, a logical approach to problem-solving, be able to work in a fast-paced environment liaising with demanding stakeholders to understand complex requirements, and be able to prioritize work under pressure with minimal supervision. The candidate should be a problem solver, and be able to bring positivity and enthusiasm in thinking about and offering potential solutions for architectural considerations.

Position Profile:
We are looking for someone to help own problems and be able to demonstrate leadership and responsibility for the delivery of new features. As part of the development cycle, you would be expected to write quality unit tests, supply documentation if relevant for new feature build-outs, and be involved in the test cycle (UAT, integration, regression) for the delivery and fixing of bugs for your new features. Although the role is predominantly Java, we require someone who is flexible with the development environment, as some days you might be writing Java, and other days you might be fixing stored procedures or Perl scripts. You would be expected to get involved in the Level 3 production support rota, which is shared between our developers on a monthly cycle, and to occasionally help with weekend deployment activities to deploy and verify any code changes you have been involved in.

Team Profile:
The team and role are ideal for someone looking for a strong career development path with many opportunities to grow, learn and develop. The role requires someone who is flexible and able to respond to a dynamic business environment. The candidate must be adaptable to work across multiple technologies and disciplines, with a focus on delivering quality solutions for the business in a timely fashion. This role suits people experienced in complex data domains.

Required Skills:
• Experience of agile and scrum methodologies
• Core Java
• Unix shell scripting
• SQL and relational databases such as DB2
• Integration technologies - MQ/XML/SOAP/JSON/Protocol Buffers/Spring
• Enterprise architecture patterns, GoF design patterns
• Build & agile - Ant, Gradle/Maven, Sonar, Jenkins/Hudson, Git/Perforce
• Sound understanding of Object-Oriented Analysis, Design and Programming
• Strong communication and stakeholder management skills
• Scala/Spark or big data will be an added advantage
• Candidate must have good experience with databases
• Excellent communication and problem-solving skills

Desired Skills:
• Experience in banking and regulatory reporting (SFTR, MAS/ASIC, etc.)
• Knowledge of OTC, listed and cash products
• Domain-driven design and microservices
Niyuj Enterprise Software Solutions Pvt. Ltd.
Navi Mumbai, Pune
3 - 6 yrs
₹5L - ₹9L / yr
Linux/Unix
PostgreSQL
Ruby
Web Analytics
Ruby on Rails (ROR)
+2 more
Dear Candidate,

Please find the details below:

Ruby on Rails Developer
Years of experience: 3 to 6 years

Required Skills:
• Ruby, Ruby on Rails; experience in developing web applications using Ruby/RoR
• Databases: PostgreSQL
• Knowledge of REST is an added advantage
• OS: Linux

Please share your details at [email protected] with the following:
• Total Exp:
• Rel Exp:
• Current CTC:
• Expected CTC:
• Notice Period:

About Niyuj:
Niyuj is a product engineering company that engages with the customer at different levels in the product development lifecycle in order to build quality products, on budget and on time. Founded in 2007 by a passionate technology leader, with stable and seasoned leadership that has hands-on experience working for or consulting companies from bootstrapped start-ups to large multinationals. Global experience in the US, Australia & India; we have worked with Fortune 500 companies as well as prominent startups; clients include Symantec, VMware, Carbonite and Edgewater Networks.

Domain areas we work in:
• CLOUD SERVICES - Enterprises are rushing to incorporate cloud computing, big data, and mobile into their IT infrastructures.
• BIG-DATA ANALYTICS - Revolutionizing the way Fortune 1000 companies harness billions of data points and turn them into a competitive advantage.
• NETWORK AND SECURITY - Network and security-related system-level work that meets customer demands and delivers real value.

Our prime customer, Carbonite, is America's #1 cloud backup and storage company, with over 1.5 million customers, headquartered in Boston, MA, with offices in 15 locations across the world.

Your potential for exponential growth: Your experience and expertise would be a great addition to our team, and you will have an opportunity to work closely with industry leaders, literally sitting across the table and jointly building the future with folks who are noted gurus and industry veterans from prestigious institutions like the IITs and top US universities, with industry experience in Fortune 500 companies like EMC, Symantec and VERITAS.
Mintifi
Posted by Suchita Upadhyay
Mumbai
2 - 4 yrs
₹6L - ₹15L / yr
Big Data
Hadoop
MySQL
MongoDB
YARN
Job Title: Software Developer – Big Data

Responsibilities:
We are looking for a Big Data Developer who can drive innovation, take ownership and deliver results.
• Understand business requirements from stakeholders
• Build & own Mintifi Big Data applications
• Be heavily involved in every step of the product development process, from ideation to implementation to release
• Design and build systems with automated instrumentation and monitoring
• Write unit & integration tests
• Collaborate with cross-functional teams to validate and get feedback on the efficacy of results created by the big data applications; use the feedback to improve the business logic
• Proactive approach to turning ambiguous problem spaces into clear design solutions

Qualifications:
• Hands-on programming skills in Apache Spark using Java or Scala
• Good understanding of Data Structures and Algorithms
• Good understanding of relational and non-relational database concepts (MySQL, Hadoop, MongoDB)
• Experience with Hadoop ecosystem components like YARN and Zookeeper would be a strong plus
Read more
Fractal Analytics

at Fractal Analytics

5 recruiters
Mili Panicker
Posted by Mili Panicker
Mumbai
2 - 5 yrs
₹7L - ₹20L / yr
Javascript
HTML/CSS
React.js
AngularJS (1.x)
MySQL
+3 more
Role Brief: 2-3 years of experience in JavaScript (including ES2015), HTML5, SVG, CSS3/SCSS. The engineer will use his/her skills & experience to translate graphical designs into delightful implementations, manage and enhance the application user experience, and help train other developers.

Brief about the Team & Fractal: Fractal Analytics is leading Fortune 500 companies to leverage Big Data, analytics, and technology to drive smarter, faster and more accurate decisions in every aspect of their business. Trial Run is a cloud-based product that helps businesses solve bigger enterprise problems. Our team is a mix of passionate data scientists, engineers, product managers and domain experts. Together, our mission is to build world-class products to help our customers create a culture of experimentation.

Job Responsibilities:
- Review designs created by web designers, clarify issues and implement the designs
- Implement intuitive and interactive visualizations to present analytical insights to users
- Collaborate closely with designers, engineers, client support, and data scientists to implement and improve the functionality of the application
- Train other developers on new technologies
- Follow and introduce industry best practices in the software development lifecycle
- Optimize and debug applications
- Maintain updated knowledge of the development industry and any advancements in technology

Experience:
Must Have: 2-3 years of experience in JavaScript (including ES2015), HTML5, SVG, CSS3/SCSS
Good to Have: Proficiency working with JavaScript libraries, server frameworks, control systems, testing libraries, and REST APIs

Education: Any
Read more
Pion Global Solutions LTD
Sheela P
Posted by Sheela P
Mumbai
3 - 100 yrs
₹4L - ₹15L / yr
Spark
Big Data
Hadoop
HDFS
Apache Sqoop
+2 more
Looking for Big Data Developers in Mumbai.
Read more
Arque Capital

at Arque Capital

2 recruiters
Hrishabh Sanghvi
Posted by Hrishabh Sanghvi
Mumbai
5 - 11 yrs
₹15L - ₹30L / yr
C++
Big Data
Technical Architecture
Cloud Computing
Python
+4 more
ABOUT US: Arque Capital is a FinTech startup working with AI in Finance in domains like Asset Management (Hedge Funds, ETFs and Structured Products), Robo Advisory, Bespoke Research, Alternate Brokerage, and other applications of technology & quantitative methods in Big Finance.

PROFILE DESCRIPTION:
1. Get the "Tech" in order for the Hedge Fund - help answer the fundamentals of the technology blocks to be used and the choice of one platform/technology over another, and help the team visualize the product with the available resources and assets
2. Build, manage, and validate a Tech Roadmap for our products
3. Architecture practices - at startups, the dynamics change very fast. Making sure that best practices are defined and followed by the team is very important; a CTO may have to play the "garbage guy" and clean up the code from time to time, and reviewing code quality is an important activity the CTO should follow
4. Build a progressive learning culture and establish a predictable model of envisioning, designing and developing products
5. Product innovation through research and continuous improvement
6. Build out the technological infrastructure for the Hedge Fund
7. Hire and build out the technology team
8. Set up and manage the entire IT infrastructure - hardware as well as cloud
9. Ensure company-wide security and IP protection

REQUIREMENTS:
- Computer Science Engineer from Tier-I colleges only (IIT, IIIT, NIT, BITS, DHU, Anna University, MU)
- 5-10 years of relevant technology experience (no infra or database persons)
- Expertise in Python and C++ (3+ years minimum)
- 2+ years of experience building and managing Big Data projects
- Experience with technical design & architecture (1+ years minimum)
- Experience with high-performance computing - OPTIONAL
- Experience as a Tech Lead, IT Manager, Director, VP, or CTO
- 1+ year of experience managing cloud computing infrastructure (Amazon AWS preferred) - OPTIONAL
- Ability to work in an unstructured environment
- Looking to work in a small, start-up type environment based out of Mumbai

COMPENSATION: Co-Founder status and Equity partnership
Read more
LogiNext

at LogiNext

1 video
7 recruiters
Avi Sisodia
Posted by Avi Sisodia
Mumbai
5 - 9 yrs
₹17L - ₹25L / yr
Big Data
Hibernate (Java)
Spring
LogiNext is looking for a technically savvy and experienced technical architect to serve as the lead and mentor for a growing team of strong developers. You will help the team grow in size and skills, optimizing their code while working on your own. With your technical expertise you will manage priorities, deadlines and deliverables, and identify and mitigate risks. You will design and develop products that exceed client expectations in terms of value and benefit.

You have deep expertise in building secure, high-performing and scalable systems in Java. You have successfully managed complex, cross-discipline Big Data projects in the past. Your design intuition inclines towards usability, elegance and simplicity. You have successfully shipped applications with beautiful front-ends and intelligent backends. You have demonstrated strong interpersonal and communication skills.

Responsibilities:
• Lead end-to-end design and development of cutting-edge products
• Work with product management and the engineering team to build highly scalable products
• Keep an eye out for open source projects and technology trends that can be introduced in the products
• Be hands-on and adopt a practical approach to software and technology
• Work closely with the development team to explain the requirements and constantly monitor progress
• Suggest improvements in systems & processes and assist the technical team with issues needing technical expertise
• Create technical content such as blogs, technical specification documents and system integration requirements documents

Requirements:
• Master's or Bachelor's in Computer Science, Information Technology, Info Systems, or related field
• 5+ years of relevant experience in designing and developing scalable and distributed enterprise applications
• Expertise in common frameworks like Spring, Hibernate, RESTful Web Services, etc., and in managing and optimizing data stores such as MySQL, MongoDB, Elasticsearch, etc. (an illustrative sketch follows this listing)
• Experience in front-end tools and technologies (HTML5, CSS, JavaScript, jQuery, etc.) and Geographic Information Systems (GIS) is preferred
• Experience building out the configuration details for installation, deployment and configuration of cloud automation solutions on AWS
• Strong foundation in computer science, with strong competencies in data structures, algorithms and software design
• Proven ability to drive large-scale projects with a deep understanding of Agile SDLC, high collaboration and leadership
• Excellent written and oral communication skills, judgment and decision-making skills, and the ability to work under continual deadline pressure
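As an illustration of the Spring/REST expertise the listing above asks for (not part of the posting), here is a minimal Spring Boot REST controller sketch in Java. The Shipment resource, package name and endpoint path are hypothetical; a real service would back the controller with a repository layer (e.g. Hibernate/JPA over MySQL) rather than an in-memory list.

```java
package com.example.shipments;   // hypothetical package

import java.util.List;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical resource for illustration only.
record Shipment(long id, String status) {}

@RestController
@RequestMapping("/api/shipments")   // hypothetical endpoint path
class ShipmentController {

    // A fixed list keeps the sketch self-contained; a production service
    // would delegate to a service/repository layer instead.
    private final List<Shipment> shipments = List.of(
            new Shipment(1L, "IN_TRANSIT"),
            new Shipment(2L, "DELIVERED"));

    @GetMapping
    public List<Shipment> all() {
        return shipments;
    }

    @GetMapping("/{id}")
    public Shipment byId(@PathVariable("id") long id) {
        return shipments.stream()
                .filter(s -> s.id() == id)
                .findFirst()
                .orElseThrow(() -> new IllegalArgumentException("Unknown shipment: " + id));
    }
}

@SpringBootApplication
public class ShipmentApiApplication {
    public static void main(String[] args) {
        SpringApplication.run(ShipmentApiApplication.class, args);
    }
}
```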
Read more
Nextalytics Software Services Pvt Ltd
Harshal Patni
Posted by Harshal Patni
Navi Mumbai
4 - 8 yrs
₹5L - ₹15L / yr
R
Artificial Neural Networks
UIMA
Python
Big Data
+3 more
Nextalytics is an offshore research, development and consulting company based in India that focuses on high-quality, cost-effective software development and data science solutions. At Nextalytics, we have developed a culture that encourages employees to be creative, innovative, and playful. We reward intelligence, dedication and out-of-the-box thinking; if you have these, Nextalytics will be the perfect launch pad for your dreams. Nextalytics is looking for smart, driven and energetic new team members.
Read more