
Big Data Jobs in Delhi, NCR and Gurgaon

Explore top Big Data job opportunities in Delhi, NCR and Gurgaon at top companies and startups. All jobs are posted by verified employees, who can be contacted directly below.

Software Engineer

Founded 2013
Location: NCR (Delhi | Gurgaon | Noida)
Experience: 2 - 11 years
Salary: 8 - 22 lacs/annum

Job Description:
We are looking for someone who can work with the platform or analytics vertical to extend and scale our product line. Every product line depends on other products within the LimeTray ecosystem, and the SSE-2 is expected to collaborate with different internal teams to own stable and scalable releases. Each product line has its own tech stack, and the person is expected to be comfortable working across all of them as and when needed. Some of the technologies/frameworks we work on: Microservices, Java, Node, MySQL, MongoDB, Angular, React, Kubernetes, AWS, Python.

Requirements:
- Minimum 3 years' work experience in building, managing and maintaining Python-based backend applications
- B.Tech/BE in CS from Tier 1/2 institutes
- Strong fundamentals of data structures and algorithms
- Experience in Python and design patterns
- Expert in git, unit tests, technical documentation and other development best practices
- Worked with SQL and NoSQL databases (Cassandra, MySQL)
- Understanding of async programming; knowledge of messaging services like pub/sub or streaming (e.g. Kafka, ActiveMQ, RabbitMQ)
- Understanding of algorithms, data structures and server management
- Understanding of microservice or distributed architecture
- Delivered high-quality work with a significant contribution
- Experience in handling small teams
- Good debugging skills
- Good analytical and problem-solving skills

What we are looking for:
- Ownership driven: owns end-to-end development
- Team player: works well in a team; collaborates within and outside the team
- Communication: speaks and writes clearly and articulately, and maintains this standard in all forms of written communication, including email
- Proactive and persistent: acts without being told to and demonstrates a willingness to go the distance to get something done
- Develops an emotional bond with the product and does what is good for the product
- Customer-first mentality: understands customers' pain and works towards solutions
- Honest and always keeps high standards; expects the same from the team
- Strict on quality and stability of the product
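The async-programming and messaging requirements above can be illustrated with a minimal in-process publish/subscribe sketch using Python's asyncio. This is a toy stand-in for the brokers the posting names (Kafka, ActiveMQ, RabbitMQ), not LimeTray's actual stack; the PubSub class and the "orders" topic are invented for illustration.

```python
import asyncio

class PubSub:
    """Toy in-process pub/sub broker: one asyncio.Queue per subscriber."""
    def __init__(self):
        self._topics = {}

    def subscribe(self, topic):
        # Each subscriber gets its own queue, so every subscriber
        # receives every message published to the topic.
        q = asyncio.Queue()
        self._topics.setdefault(topic, []).append(q)
        return q

    async def publish(self, topic, message):
        for q in self._topics.get(topic, []):
            await q.put(message)

async def demo():
    bus = PubSub()
    inbox = bus.subscribe("orders")
    await bus.publish("orders", {"id": 1, "total": 250})
    return await inbox.get()

result = asyncio.run(demo())
```

A real broker adds persistence, partitioning, and delivery guarantees; the decoupling of producer from consumer is the part this sketch shows.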

Job posted by Bhavya Jain

Software Engineer

Founded 2013
Location: NCR (Delhi | Gurgaon | Noida)
Experience: 4 - 8 years
Salary: 10 - 25 lacs/annum

Job Description:
We are looking for someone who can work in the Analytics vertical to extend and scale our product line. Every product line depends on the Analytics vertical within the LimeTray ecosystem, and the SSE-2 is expected to collaborate with internal teams to own stable and scalable releases. Each product line has its own tech stack, and the candidate is expected to be comfortable working across all of them as and when needed. Some of the technologies/frameworks we work on: Microservices, Java, Node, Python, MySQL, MongoDB, Cassandra, Angular, React, Kubernetes, AWS.

Requirements:
- Minimum 4 years' work experience in building, managing and maintaining analytics applications
- B.Tech/BE in CS/IT from Tier 1/2 institutes
- Experience in handling small teams
- Strong fundamentals of data structures and algorithms
- Good analytical and problem-solving skills
- Strong hands-on experience in Python
- In-depth knowledge of queueing systems (Kafka/ActiveMQ/RabbitMQ)
- Experience in building data pipelines and real-time analytics systems
- Experience in SQL (MySQL) and NoSQL (Mongo/Cassandra) databases is a plus
- Understanding of service-oriented architecture
- Delivered high-quality work with a significant contribution
- Expert in git, unit tests, technical documentation and other development best practices

What we are looking for:
- Customer-first mentality: develops an emotional bond with the product and prioritizes accordingly
- Ownership driven: owns end-to-end development
- Proactive and persistent: acts without being told to and demonstrates a willingness to go the distance to get something done
- Strict on quality and stability of the product
- Communication: speaks and writes clearly and articulately, and maintains this standard in all forms of written communication
- Team player
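The data-pipeline requirement can be sketched as a chain of Python generators, where each stage consumes the previous stage's stream. This is the same shape a real-time analytics pipeline takes before a broker like Kafka is involved; the "event,value" record format here is invented for illustration.

```python
from collections import Counter

def parse(lines):
    # Stage 1: parse raw "event_type,value" records into typed tuples
    for line in lines:
        event, value = line.split(",")
        yield event, int(value)

def aggregate(events):
    # Stage 2: keep running totals per event type
    totals = Counter()
    for event, value in events:
        totals[event] += value
    return totals

raw = ["click,1", "view,1", "click,1"]
totals = aggregate(parse(raw))
```

Because the stages are generators, records flow through one at a time; swapping `raw` for a socket or consumer iterator turns the same code into a streaming job.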

Job posted by Bhavya Jain

DevOps Engineer

Founded 2013
Location: NCR (Delhi | Gurgaon | Noida)
Experience: 3 - 7 years
Salary: 10 - 15 lacs/annum

As a DevOps Engineer, you will be responsible for managing and building upon the infrastructure that supports our data intelligence platform. You'll also be involved in building tools and establishing processes to empower developers to deploy and release their code seamlessly. Across teams, we will look up to you to make key decisions for our infrastructure, networking and security. You will also own, scale, and maintain the compute and storage infrastructure for various product teams.

The ideal DevOps Engineers possess a solid understanding of system internals and distributed systems. They understand what it takes to work in a startup environment and have the zeal to establish a culture of infrastructure awareness and transparency across teams and products. They fail fast, learn faster, and execute almost instantly.

Technology stack: configuration management tools (Ansible/Chef/Puppet), cloud service providers (AWS/DigitalOcean); experience with the Docker + Kubernetes ecosystem is a plus.

WHY YOU?
* Because you love to take ownership of the infrastructure, allowing developers to deploy and manage microservices at scale.
* Because you love tinkering with and building upon new tools and technologies to make your work easier and streamlined with the industry's best practices.
* Because you have the ability to analyze and optimize performance in high-traffic internet applications.
* Because you take pride in building scalable and fault-tolerant infrastructural systems.
* Because you see explaining complex engineering concepts and design decisions to the less tech-savvy as an interesting challenge.

IN MONTH 1, YOU WILL...
* Learn about the products and internal tools that power our data intelligence platform.
* Understand the underlying infrastructure and play around with the tools used to manage it.
* Get familiar with the current architectural challenges arising from handling data and web traffic at scale.

IN MONTH 3, YOU WILL...
* Become an integral part of the architectural decisions taken across teams and products.
* Play a pivotal role in establishing a culture of infrastructure awareness and transparency across the company.
* Become the go-to person for engineers to get help solving issues with performance and scale.

IN MONTH 6 (AND BEYOND), YOU WILL...
* Hire a couple of engineers to strengthen the team and build systems to help manage high-volume and high-velocity data.
* Be involved, along with the DevOps team, in tasks ranging from tracking statistics and managing alerts to deploying new hosts and debugging intricate production issues.

About SocialCops
SocialCops is a data intelligence company empowering leaders in organizations globally, including the United Nations and Unilever. Our platform powers over 150 organizations across 28 countries. As a pioneering tech startup, SocialCops was recognized in the list of Technology Pioneers 2018 by the World Economic Forum and by the New York Times in its list of 30 global visionaries. We were also part of the Google Launchpad Accelerator 2018. Aasaan jobs named SocialCops one of the best Indian startups to work for in 2018.

Read more about our work and case studies: https://socialcops.com/case-studies/
Watch our co-founder's TEDx talk on how big data can influence decisions that matter: https://www.youtube.com/watch?v=C6WKt6fJiso
Want to know how much impact you can drive in under a year at SocialCops? See our 2017 year in review: https://socialcops.com/2017/
For more information on our hiring process, check out our blog: https://blog.socialcops.com/inside-sc/team-culture/interested-joining-socialcops-team-heres-need/
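One recurring task in this kind of role, deploying to many hosts without taking the service down, reduces to batching: only a fraction of the fleet is out of rotation at once. A minimal sketch (the host names are hypothetical; real rollouts with Ansible or Kubernetes add health checks between batches):

```python
def rolling_batches(hosts, batch_size):
    """Split a host list into deployment batches so that at most
    batch_size hosts are taken out of rotation at the same time."""
    return [hosts[i:i + batch_size] for i in range(0, len(hosts), batch_size)]

# Hypothetical five-host web tier, updated two hosts at a time
batches = rolling_batches(["web-1", "web-2", "web-3", "web-4", "web-5"], 2)
```

With `batch_size=2`, three of five hosts always stay in service; Ansible's `serial` keyword and Kubernetes' `maxUnavailable` express the same idea declaratively.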

Job posted by Dharmik Gohel

Back-End Developer

Founded 2013
Location: NCR (Delhi | Gurgaon | Noida)
Experience: 3 - -2 years
Salary: 6 - 15 lacs/annum

As a Back-End Developer, you will serve as the gatekeeper to all data at SocialCops. You will abstract away complex data interactions with easy-to-use APIs that will power web applications. While data should be easily accessible to those who need it, you must also ensure that it does not fall into the wrong hands.

The ideal Back-End Developers are polyglots who are fluent in HTTP and always pick the right set of tools for the job at hand. They understand what it takes to work in a startup environment and know when to trade performance for development speed. They are critical to the success of many products at SocialCops. They fail fast, learn faster, and execute almost instantly.

Interested, but not ready to apply just yet? Sign up for updates from our team, and we'll keep you updated about how we're growing and tell you about future opportunities as they come up!

Last date to apply: 31st August. Applications are processed on a rolling basis, so apply as soon as you can!

WHY US?
* Work with a team of engineers, growth marketers, designers, and entrepreneurs, all united by a common mission to drive the most important decisions that will affect humanity.
* Work on complex data problems.
* Our fail wall will be your claim to fame.

KEY RESPONSIBILITIES
* Analyze, design and develop tests and test-automation suites.
* Design and develop a processing platform using various configuration management technologies.
* Test software development methodology in an agile environment.
* Provide ongoing maintenance, support and enhancements for existing systems and platforms.
* Collaborate cross-functionally with data scientists, business users, project managers and other engineers to create elegant solutions.
* Provide recommendations for continuous improvement.
* Work alongside other engineers on the team to consistently apply best practices and improve our technology.

BASIC QUALIFICATIONS
* A Bachelor's degree in Computer Science or an equivalent combination of technical education and work experience.
* Strong object-oriented design and coding skills (C/C++ and/or Java/C# on a UNIX or Linux/Windows platform).
* Solid software development background, including design patterns, data structures, and test-driven development.
* Solid experience with distributed (multi-tiered) systems, algorithms, and relational databases.
* Software development experience in AWS, S3, building web services and highly scalable applications, or equivalent.
* Proficiency with modern web development technologies and techniques, including JavaScript, Python, web services, etc.
* Strong analytical skills, with excellent problem-solving abilities.
* Strong customer focus, ownership, urgency and drive.
* Leadership qualities of hustle, curiosity and generosity.
* Excellent verbal and written communication skills.
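The "easy-to-use APIs" idea can be sketched as a minimal WSGI application, the plain-Python interface most web frameworks build on. The /api/health endpoint is invented for illustration, and the app is invoked directly rather than behind a server so the sketch stays self-contained:

```python
import json

def app(environ, start_response):
    """Tiny WSGI app exposing one read-only JSON endpoint."""
    if environ.get("PATH_INFO") == "/api/health":
        body = json.dumps({"status": "ok"}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"error": "not found"}).encode()]

# Call the app directly with a fabricated request environ; no server needed.
captured = {}
def start_response(status, headers):
    captured["status"] = status

response = b"".join(app({"PATH_INFO": "/api/health"}, start_response))
```

Behind `wsgiref.simple_server` (or gunicorn in production) the same callable serves real HTTP traffic; frameworks like Flask and Django are, at bottom, richer versions of this function.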

Job posted by Dharmik Gohel

Senior Specialist - BigData Engineering

Founded 2000
Location: Mumbai, NCR (Delhi | Gurgaon | Noida), Bengaluru (Bangalore)
Experience: 5 - 10 years
Salary: 15 - 35 lacs/annum

Role Brief:
6+ years of demonstrable experience designing technological solutions to complex data problems, and developing and testing modular, reusable, efficient and scalable code to implement those solutions.

Brief about Fractal & Team:
Fractal Analytics is leading Fortune 500 companies to leverage big data, analytics, and technology to drive smarter, faster and more accurate decisions in every aspect of their business. Our Big Data capability team is hiring technologists who can produce beautiful and functional code to solve complex analytics problems. If you are an exceptional developer who loves to push the boundaries to solve complex business problems using innovative solutions, then we would like to talk with you.

Job Responsibilities:
- Provide technical leadership in the Big Data space (Hadoop stack: M/R, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.; NoSQL stores like Cassandra and HBase) across Fractal, and contribute to open-source Big Data technologies.
- Visualize and evangelize next-generation infrastructure in the Big Data space (batch, near-real-time and real-time technologies).
- Evaluate and recommend a Big Data technology stack that aligns with the company's technology.
- Be passionate about continuously learning, experimenting with, applying and contributing to cutting-edge open-source technologies and software paradigms.
- Drive significant technology initiatives end to end and across multiple layers of architecture.
- Provide strong technical expertise (performance, application design, stack upgrades) to lead platform engineering.
- Define and drive best practices that can be adopted in the Big Data stack; evangelize these best practices across teams and BUs.
- Drive operational excellence through root cause analysis and continuous improvement for Big Data technologies and processes, and contribute back to the open-source community.
- Provide technical leadership and be a role model to data engineers pursuing a technical career path in engineering.
- Provide and inspire innovations that fuel the growth of Fractal as a whole.

EXPERIENCE
Must have (ideally, this would include work on the following technologies):
- Expert-level proficiency in at least one of Java, C++ or Python (preferred); Scala knowledge is a strong advantage.
- Strong understanding of and experience in distributed computing frameworks, particularly Apache Hadoop 2.0 (YARN; MR & HDFS) and associated technologies: one or more of Hive, Sqoop, Avro, Flume, Oozie, Zookeeper, etc.
- Hands-on experience with Apache Spark and its components (Streaming, SQL, MLlib) is a strong advantage.
- Operating knowledge of cloud computing platforms (AWS, especially the EMR, EC2, S3 and SWF services and the AWS CLI).
- Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.
- Ability to work in a team in an agile setting, familiarity with JIRA, and a clear understanding of how Git works.
- A technologist: loves to code and design.
In addition, the ideal candidate has great problem-solving skills and the ability and confidence to hack their way out of tight corners.

Relevant experience:
- Java, Python or C++ expertise
- Linux environment and shell scripting
- Distributed computing frameworks (Hadoop or Spark)
- Cloud computing platforms (AWS)

Good to have:
- A statistical or machine learning DSL like R
- Distributed and low-latency (streaming) application architecture
- Row-store distributed DBMSs such as Cassandra
- Familiarity with API design

Qualification:
B.E/B.Tech/M.Tech in Computer Science or a related technical degree, or equivalent.
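The Hadoop M/R model this role centers on can be sketched in plain Python as three phases: map, shuffle, reduce. This toy word count mirrors what a real Hadoop or Spark job distributes across a cluster; the sample documents are invented.

```python
from collections import defaultdict

def map_phase(records):
    # Mapper: emit a (word, 1) pair for every word in every record
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle/sort: group all values emitted under the same key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: combine each key's values into a final count
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data big insights", "data pipelines"]
counts = reduce_phase(shuffle(map_phase(docs)))
```

In Hadoop, the map and reduce phases run on different machines and the shuffle moves data over the network; the program logic, however, is exactly this shape.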

Job posted by Jesvin Varghese

Big Data Engineer

Founded 2015
Location: Navi Mumbai, Noida, NCR (Delhi | Gurgaon | Noida)
Experience: 1 - 2 years
Salary: 4 - 10 lacs/annum

Job Requirements:
- Installation, configuration and administration of Big Data components (including Hadoop/Spark) for batch and real-time analytics and data hubs.
- Capable of processing large sets of structured, semi-structured and unstructured data.
- Able to assess business rules, collaborate with stakeholders, and perform source-to-target data mapping, design and review.
- Familiar with data architecture: data ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning and advanced data processing.
- Proficient in executing efficient and robust ETL workflows.
- Able to tune Hadoop solutions to improve performance and the end-user experience.
- Able to work in teams and collaborate with others to clarify requirements.
- Strong coordination and project management skills to handle complex projects.
- Enjoys being challenged and solving complex problems on a daily basis.
- Engineering background.
- Optional: visual communicator, able to convert and present data in easily comprehensible visualizations using tools like D3.js or Tableau.
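The ETL-workflow requirement above can be sketched as three small functions, extract, transform, load, here loading into an in-memory SQLite table. The sample rows and schema are invented for illustration:

```python
import sqlite3

def extract():
    # Source rows; in a real pipeline these come from files, APIs, or HDFS
    return [{"name": " Alice ", "age": "34"}, {"name": "Bob", "age": "n/a"}]

def transform(rows):
    # Clean whitespace, coerce types, and drop records that fail validation
    for row in rows:
        try:
            yield row["name"].strip(), int(row["age"])
        except ValueError:
            continue  # "n/a" is not a valid age; skip the record

def load(rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")
    conn.executemany("INSERT INTO users VALUES (?, ?)", rows)
    return conn

conn = load(transform(extract()))
loaded = conn.execute("SELECT name, age FROM users").fetchall()
```

A robust workflow would also log the rejected rows and make the load idempotent; the extract/transform/load separation is the part that carries over to any tool.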

Job posted by Sneha Pandey

Senior Software Engineer

Founded 2013
Location: NCR (Delhi | Gurgaon | Noida)
Experience: 4 - 6 years
Salary: 15 - 18 lacs/annum

Requirements:
- Minimum 4 years' work experience in building, managing and maintaining analytics applications
- B.Tech/BE in CS/IT from Tier 1/2 institutes
- Strong fundamentals of data structures and algorithms
- Good analytical and problem-solving skills
- Strong hands-on experience in Python
- In-depth knowledge of queueing systems (Kafka/ActiveMQ/RabbitMQ)
- Experience in building data pipelines and real-time analytics systems
- Experience in SQL (MySQL) and NoSQL (Mongo/Cassandra) databases is a plus
- Understanding of service-oriented architecture
- Delivered high-quality work with a significant contribution
- Expert in git, unit tests, technical documentation and other development best practices
- Experience in handling small teams
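The queueing-systems requirement boils down to the producer/consumer pattern. A minimal threaded sketch using Python's standard queue module (real systems swap the in-process queue for Kafka, ActiveMQ, or RabbitMQ; the doubling "work" is invented for illustration):

```python
import queue
import threading

def producer(q, items):
    # Enqueue work items, then a sentinel meaning "no more work"
    for item in items:
        q.put(item)
    q.put(None)

def consumer(q, results):
    # Drain the queue until the sentinel arrives
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item * 2)  # stand-in for real processing

q = queue.Queue()
results = []
worker = threading.Thread(target=consumer, args=(q, results))
worker.start()
producer(q, [1, 2, 3])
worker.join()
```

The queue decouples the producer's rate from the consumer's; brokers like Kafka add durability and let producer and consumer live on different machines.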

Job posted by Tanika Monga

Machine learning Developer

Founded 2017
Location: NCR (Delhi | Gurgaon | Noida)
Experience: 1 - 5 years
Salary: 15 - 16 lacs/annum

We are looking for a Machine Learning Developer who possesses a passion for machine technology and big data and will work with the next-generation universal IoT platform.

Responsibilities:
- Design and build machines that learn, predict and analyze data.
- Build and enhance tools to mine data at scale.
- Enable the integration of machine learning models into the Chariot IoT platform.
- Ensure the scalability of machine learning analytics across millions of networked sensors.
- Work with other engineering teams to integrate our streaming, batch, or ad-hoc analysis algorithms into Chariot IoT's suite of applications.
- Develop generalizable APIs so other engineers can use our work without needing to be machine learning experts.
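"Machines that learn, predict and analyze data" starts with fitting a model to observations. A minimal sketch: ordinary least squares for a single feature, in plain Python. The sensor-style data points are invented, and production work would use a library like scikit-learn rather than hand-rolled math.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept (one feature)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form OLS: covariance of x,y divided by variance of x
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Toy readings lying exactly on y = 2x + 1
slope, intercept = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```

The same fit-then-predict loop, scaled up to many features and streamed sensor data, is what the responsibilities above describe.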

Job posted by Raj Garg

Artificial Intelligence Developers

Founded 2016
Location: NCR (Delhi | Gurgaon | Noida)
Experience: 1 - 3 years
Salary: 3 - 9 lacs/annum

Precily AI: automatic summarization, shortening a business document or book with our AI to create a summary of the major points of the original document. The AI can produce a coherent summary, taking into account variables such as length, writing style, and syntax. We're also working in the legal domain to reduce the high number of pending cases in India. We use Artificial Intelligence and Machine Learning capabilities such as NLP and neural networks in processing data to provide solutions for industries such as enterprise, healthcare, and legal.
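Extractive summarization of the kind described can be sketched by scoring sentences on word frequency and keeping the top scorers. This is a simple frequency heuristic, far from the neural approaches mentioned above; the sample document is invented.

```python
import re
from collections import Counter

def summarize(text, n_sentences=1):
    """Score each sentence by the corpus frequency of its words and
    return the n_sentences highest-scoring ones (extractive summary)."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"\w+", s.lower())),
        reverse=True,
    )
    return " ".join(scored[:n_sentences])

doc = ("Data drives decisions. Data pipelines feed models. "
       "Weather was pleasant.")
summary = summarize(doc)
```

The sentence sharing the most high-frequency words wins; length normalization and abstractive (generated) summaries are the obvious next steps.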

Job posted by Bharath Rao

Big Data Evangelist

Founded 2016
Location: Noida, Hyderabad, NCR (Delhi | Gurgaon | Noida)
Experience: 2 - 6 years
Salary: 4 - 12 lacs/annum

We are looking for a technically sound, excellent trainer on big data technologies. This is an opportunity to become well known in the industry and gain visibility. Host regular sessions on big data technologies and get paid to learn.

Job posted by Suchit Majumdar

Freelance Faculty

Founded 2009
Location: Anywhere, United States, Canada
Experience: 3 - 10 years
Salary: 2 - 10 lacs/annum

To introduce myself, I head Global Faculty Acquisition for Simplilearn.

About my company: SIMPLILEARN has transformed 500,000+ careers across 150+ countries with 400+ courses, and yes, we are a Registered Professional Education Provider offering PMI-PMP, PRINCE2, ITIL (Foundation, Intermediate & Expert), MSP, COBIT, Six Sigma (GB, BB & Lean Management), Financial Modeling with MS Excel, CSM, PMI-ACP, RMP, CISSP, CTFL, CISA, CFA Level 1, CCNA, CCNP, Big Data Hadoop, CBAP, iOS, TOGAF, Tableau, Digital Marketing, Data Scientist with Python, Data Science with SAS & Excel, Big Data Hadoop Developer & Administrator, Apache Spark and Scala, Tableau Desktop 9, Agile Scrum Master, Salesforce Platform Developer, Azure & Google Cloud. Our official website: www.simplilearn.com

If you're interested in teaching, interacting, sharing real-life experiences and have a passion to transform careers, please join hands with us.

Onboarding process:
• Send your updated CV to my email id, with copies of relevant certificates.
• Sample e-learning access will be shared with a 15-day trial after your registration on our website.
• My subject matter expert will evaluate you in your areas of expertise over a telephonic conversation (duration: 15 to 20 minutes).
• Commercial discussion.
• We will register you to our ongoing online sessions to introduce you to our course content and the Simplilearn style of teaching.
• A demo will be conducted to check your training style and Internet connectivity.
• Freelancer Master Service Agreement.

Payment process:
• Once a workshop, or the last day of training for a batch, is completed, share your invoice.
• An automated tracking ID will be shared from our automated ticketing system.
• Our faculty group will verify the details provided and forward the invoice to our internal finance team to process your payment; if any additional information is required, we will coordinate with you.
• Payment will be processed within 15 working days as per policy; the 15 days count from the date the invoice is received.

Please share your updated CV to proceed to the next step of the onboarding process.

Job posted by Steven John

Data Scientist

Founded 2016
Location: NCR (Delhi | Gurgaon | Noida)
Experience: 3 - 6 years
Salary: 5 - 8 lacs/annum

Transporter is an AI-enabled location stack that helps companies improve their commerce, engagement, or operations through their mobile apps, built for the next generation of online commerce.

Job posted by Shailendra Singh

Freelance Trainers

Founded 2015
Location: Anywhere
Experience: 8 - 11 years
Salary: 5 - 10 lacs/annum

We are a team with a mission: to create and deliver great learning experiences to engineering students through various workshops and courses. If you are an industry professional who:
- Sees great scope for improvement in higher technical education across the country and connects with our purpose of impacting it for good.
- Is keen on sharing your technical expertise to enhance the practical learning of students.
- Is innovative in your ways of creating content and delivering it.
- Doesn't mind earning a few extra bucks while doing this in your free time.
Buzz us at info@monkfox.com and let us discuss how, together, we can take technological education in the country to new heights.

Job posted by Tanu Mehra
Why apply via CutShort?
Connect with actual hiring teams and get their fast response. No 3rd party recruiters. No spam.