

Deqode
http://deqode.com
Jobs at Deqode



Location: Mumbai (On-site)
Experience: 2+ Years
Joining: Immediate Joiner Preferred
Employment Type: Full-time
🔍 Key Responsibilities:
- Develop and maintain web applications using PHP and CodeIgniter framework.
- Build and manage dynamic frontends using Angular.
- Integrate RESTful APIs and ensure seamless communication between backend and frontend.
- Optimize applications for speed, performance, and scalability.
- Collaborate with cross-functional teams for end-to-end product development.
- Debug and resolve application issues and bugs.
✅ Must-Have Skills:
- Strong experience with PHP and CodeIgniter.
- Proficient in Angular (any version).
- Solid understanding of HTML5, CSS3, JavaScript, and TypeScript.
- Knowledge of MySQL or other relational databases.
- Familiar with Git version control and RESTful API integration.

Role: GCP Data Engineer
Notice Period: Immediate Joiners
Experience: 5+ years
Location: Remote
Company: Deqode
About Deqode
At Deqode, we work with next-gen technologies to help businesses solve complex data challenges. Our collaborative teams build reliable, scalable systems that power smarter decisions and real-time analytics.
Key Responsibilities
- Build and maintain scalable, automated data pipelines using Python, PySpark, and SQL.
- Work on cloud-native data infrastructure using Google Cloud Platform (BigQuery, Cloud Storage, Dataflow).
- Implement clean, reusable transformations using DBT and Databricks.
- Design and schedule workflows using Apache Airflow.
- Collaborate with data scientists and analysts to ensure downstream data usability.
- Optimize pipelines and systems for performance and cost-efficiency.
- Follow best software engineering practices: version control, unit testing, code reviews, CI/CD.
- Manage and troubleshoot data workflows in Linux environments.
- Apply data governance and access control via Unity Catalog or similar tools.
Required Skills & Experience
- Strong hands-on experience with PySpark, Spark SQL, and Databricks.
- Solid understanding of GCP services (BigQuery, Cloud Functions, Dataflow, Cloud Storage).
- Proficiency in Python for scripting and automation.
- Expertise in SQL and data modeling.
- Experience with DBT for data transformations.
- Working knowledge of Airflow for workflow orchestration.
- Comfortable with Linux-based systems for deployment and troubleshooting.
- Familiar with Git for version control and collaborative development.
- Understanding of data pipeline optimization, monitoring, and debugging.
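Much of the pipeline work described above comes down to expressing aggregations in SQL and pushing them into the warehouse (BigQuery, Spark SQL). As a rough, hedged illustration only — using the standard library's sqlite3 as a stand-in for a warehouse, with an invented `events` table and column names — a typical group-by transformation might look like:

```python
import sqlite3

def daily_totals(rows):
    """Aggregate (user_id, day, amount) rows into per-user daily totals.

    The schema here (events: user_id, day, amount) is hypothetical;
    sqlite3 stands in for BigQuery/Spark SQL so the sketch runs anywhere.
    """
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE events (user_id TEXT, day TEXT, amount REAL)")
    con.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    cur = con.execute(
        "SELECT user_id, day, SUM(amount) FROM events "
        "GROUP BY user_id, day ORDER BY user_id, day"
    )
    result = cur.fetchall()
    con.close()
    return result
```

The same `GROUP BY` shape translates directly to BigQuery SQL or a Spark SQL query in a scheduled Airflow task.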

Role - MLOps Engineer
Required Experience - 4 Years
Location - Pune, Gurgaon, Noida, Bhopal, Bangalore
Mode - Hybrid
Key Requirements:
- 4+ years of experience in Software Engineering with MLOps focus
- Strong expertise in AWS, particularly AWS SageMaker (required)
- AWS Data Zone experience (preferred)
- Proficiency in Python, R, Scala, or Spark
- Experience developing scalable, reliable, and secure applications
- Track record of production-grade development, integration and support

We are looking for skilled and passionate Data Engineers with a strong foundation in Python programming and hands-on experience working with APIs, AWS cloud, and modern development practices. The ideal candidate will have a keen interest in building scalable backend systems and in working with big data tools like PySpark.
Key Responsibilities:
- Write clean, scalable, and efficient Python code.
- Work with Python frameworks such as PySpark for data processing.
- Design, develop, update, and maintain APIs (RESTful).
- Deploy and manage code using GitHub CI/CD pipelines.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Work on AWS cloud services for application deployment and infrastructure.
- Basic database design and interaction with MySQL or DynamoDB.
- Debugging and troubleshooting application issues and performance bottlenecks.
Required Skills & Qualifications:
- 4+ years of hands-on experience with Python development.
- Proficient in Python basics with a strong problem-solving approach.
- Experience with AWS Cloud services (EC2, Lambda, S3, etc.).
- Good understanding of API development and integration.
- Knowledge of GitHub and CI/CD workflows.
- Experience in working with PySpark or similar big data frameworks.
- Basic knowledge of MySQL or DynamoDB.
- Excellent communication skills and a team-oriented mindset.
Nice to Have:
- Experience in containerization (Docker/Kubernetes).
- Familiarity with Agile/Scrum methodologies.
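For the API and AWS side of this role, a common pattern is a Lambda function behind an API Gateway proxy integration. The sketch below is illustrative only: the event shape follows the standard proxy format, but the route and payload are invented for this example.

```python
import json

def handler(event, context):
    """Hypothetical Lambda-style handler for GET /hello?name=...

    Reads the query string from an API Gateway proxy event and returns
    a JSON response dict in the format the proxy integration expects.
    """
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello {name}"}),
    }
```

Handlers like this are typically deployed via a GitHub Actions CI/CD workflow and backed by DynamoDB or MySQL for persistence, as the responsibilities above describe.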

Be hands-on with describing business rules as technical artifacts for analysts, business product managers, and any functional team that depends on this information
Proficiency with full-stack development, preferably React, Java, Spring Boot, and Python. Perform unit tests, resolve bugs, and be responsible for the delivery of the project to production
Be a thought leader in bringing the GenAI capabilities to Visa's product development
Be comfortable looking at system logs when needed and identifying bugs when they appear during the course of analysis
Discuss technical implementation details with architects and other technical teams
Collaborate with engineering teams to build robust, reliable, and scalable platforms
Synthesize requirements into a common set of platform and services capabilities
Candidates will be responsible for value proposition definition, communications across technical, business and product functions, cost/benefit analysis, and executive presentations
Partner with business operations and analytics team to build right monitoring capabilities and metrics which helps monitor the health of the platform and facilitates data driven decision making
Define product requirements which involve writing clear, thorough, and detailed user stories for complex features and product sets
Be the primary liaison between business units, business product managers, engineering, and other applicable groups to ensure flawless integration of new and existing feature sets
Highly execution focused by working in conjunction with various counterparts across functions
Ensure product vision and requirements are designed with long-term vision in mind and can be launched globally with minimal tweaks
Work closely with senior leadership providing status, action plan and recommendations
We are hiring a Java Production Support Engineer with hands-on experience in Java, Spring Boot, and Splunk. You’ll be responsible for ensuring the smooth functioning of our production systems by proactively monitoring, troubleshooting, and resolving issues in real time.
Key Responsibilities
- Provide L2/L3 support for Java/Spring Boot-based applications in production.
- Monitor logs and system health using Splunk and other observability tools.
- Troubleshoot performance, latency, and availability issues.
- Perform root cause analysis (RCA) and work with dev teams for permanent fixes.
- Support incident and change management processes.
- Create and maintain runbooks and support documentation.
Must-Have Skills
- Strong knowledge of Java, Spring Boot, and REST APIs.
- Proficient in using Splunk for log analysis and monitoring.
- Good understanding of relational databases (SQL) and Linux commands.
- Experience with CI/CD tools (Jenkins, Git) and basic scripting (Bash, Python).
- Analytical thinking and excellent problem-solving skills.
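The "basic scripting" bullet usually means quick triage helpers for log analysis alongside Splunk. As a hedged sketch — the bracketed-level log format here is an assumption, not any real application's format — a minimal severity counter might look like:

```python
import re
from collections import Counter

# Matches a severity level in square brackets, e.g. "[ERROR]".
# The format is hypothetical, chosen only for illustration.
LEVEL_RE = re.compile(r"\[(ERROR|WARN|INFO)\]")

def count_levels(lines):
    """Count log lines per severity level, skipping unmatched lines."""
    counts = Counter()
    for line in lines:
        m = LEVEL_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return dict(counts)
```

In practice the same question ("how many errors in the last hour?") would be a Splunk search; a script like this is the fallback when you are on a box with raw log files.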

Roles and Responsibilities:
- Build scalable and loosely coupled services to extend our platform
- Build bulletproof API integrations with third-party APIs for various use cases
- Evolve our Infrastructure and add a few more nines to our overall availability
- Have full autonomy and own your code; decide on the technologies and tools to deliver as well as operate large-scale applications on AWS
- Give back to the open-source community through contributions on code and blog posts
- This is a startup so everything can change as we experiment with more product improvements
Some specific Requirements:
- At least 2 years of development experience
- You have prior experience developing and working on consumer-facing web/app products
- Hands-on experience in JavaScript. Exceptions can be made if you’re really good at any other language with experience in building web/app-based tech products
- Expertise in Node.js and experience in at least one of the following frameworks: Express.js, Koa.js, Socket.io
- Good knowledge of async programming using Callbacks, Promises, and Async/Await
- Hands-on experience with Frontend codebases using HTML, CSS, and AJAX
- Working knowledge of MongoDB, Redis, MySQL
- Good understanding of Data Structures, Algorithms, and Operating Systems
- You've worked with AWS services in the past and have experience with EC2, ELB, AutoScaling, CloudFront, S3
- Experience with Frontend Stack would be added advantage (HTML, CSS)
- You might not have experience with all the tools that we use but you can learn those given the guidance and resources
- Experience in Vue.js would be plus

Requirements:
- Must have proficiency in Python
- At least 3 years of professional experience in software application development.
- Good understanding of REST APIs and a solid experience in testing APIs.
- Should have built APIs at some point and practical knowledge on working with them
- Must have experience in API testing tools like Postman and in setting up the prerequisites and post-execution validations using these tools
- Ability to develop applications for test automation
- Should have worked in a distributed micro-service environment
- Hands-on experience with Python packages for testing (preferably pytest).
- Should be able to create fixtures, mock objects and datasets that can be used by tests across different micro-services
- Proficiency in Git
- Strong in writing SQL queries
- Experience with tools like Jira or Asana (or a similar bug-tracking tool), Confluence (wiki), and Jenkins (CI)
- Excellent written and oral communication and organisational skills with the ability to work within a growing company with increasing needs
- Proven track record of ability to handle time-critical projects
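The fixture/mock requirement above is about isolating a service call so tests run without the real dependency. A minimal sketch, using the standard library's unittest.mock so it runs even without pytest installed (`get_user` and the service URL are invented for illustration):

```python
from unittest import mock

def get_user(session, user_id):
    """Fetch a user record from a (hypothetical) REST endpoint."""
    resp = session.get(f"https://api.example.com/users/{user_id}")
    resp.raise_for_status()
    return resp.json()

def test_get_user_returns_parsed_json():
    # The mocked session stands in for requests.Session or similar;
    # attribute chaining lets us stub the response's json() payload.
    session = mock.Mock()
    session.get.return_value.json.return_value = {"id": 7, "name": "Ada"}
    assert get_user(session, 7) == {"id": 7, "name": "Ada"}
    session.get.assert_called_once_with("https://api.example.com/users/7")
```

Under pytest, the mocked session would typically become a shared `@pytest.fixture` so the same stub can be reused across microservice test suites.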
Good to have:
- Good understanding of CI/CD
- Knowledge of queues, especially Kafka
- Ability to independently manage test environment deployments and handle issues around them
- Experience performing load testing of API endpoints
- Should have built an API test automation framework from scratch and maintained it
- Knowledge of cloud platforms like AWS, Azure
- Knowledge of different browsers and cross-platform operating systems
- Knowledge of JavaScript
- Web programming, Docker, and 3-tier architecture knowledge is preferred.
- Should have knowledge of API creation; coding experience would be an add-on.
- 5+ years of experience in test automation using tools like TestNG, Selenium WebDriver (Grid, parallel runs, SauceLabs), and Mocha/Chai for front-end and backend test automation
- Bachelor's degree in Computer Science / IT / Computer Applications

Profile: Salesforce Developer
Experience: 4+ years
Location: Pune, Nagpur, Indore, Bangalore (Remote options available)
Salary: Up to 15 LPA
Job Description
We're looking for a skilled Salesforce Developer with strong experience in API development, integrations, and Lightning Web Components (LWC). The ideal candidate will have hands-on expertise with Salesforce Service Cloud and Data Cloud implementations.
Requirements
- 4+ years of Salesforce development experience
- Expert knowledge of Salesforce APIs and integration patterns
- Proficiency in Lightning Web Components (LWC) development
- Experience implementing and configuring Service Cloud solutions
- Hands-on experience with Salesforce Data Cloud
- Experience with data migration and ETL processes
- Ability to design and implement scalable Salesforce solutions
- Strong problem-solving abilities and attention to detail
- Excellent communication skills
Preferred Qualifications
- Salesforce certifications (Platform Developer I/II, Application Architect)
- Experience with Apex, SOQL, and Lightning Design System
- Background in middleware tools and REST/SOAP web services
- Knowledge of JavaScript frameworks

Must be:
- Based in Mumbai
- Comfortable with Work from Office
- Available to join immediately
Responsibilities:
- Manage, monitor, and scale production systems across cloud (AWS/GCP) and on-prem.
- Work with Kubernetes, Docker, Lambdas to build reliable, scalable infrastructure.
- Build tools and automation using Python, Go, or relevant scripting languages.
- Ensure system observability using tools like NewRelic, Prometheus, Grafana, CloudWatch, PagerDuty.
- Optimize for performance and low-latency in real-time systems using Kafka, gRPC, RTP.
- Use Terraform, CloudFormation, Ansible, Chef, Puppet for infra automation and orchestration.
- Load testing using Gatling, JMeter, and ensuring fault tolerance and high availability.
- Collaborate with dev teams and participate in on-call rotations.
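The "build tools and automation using Python" responsibility often starts with small reliability helpers. As a hedged sketch (attempt count and delays are arbitrary, and the function name is invented), a retry wrapper with exponential backoff might look like:

```python
import time

def retry(fn, attempts=3, base_delay=0.01):
    """Call fn(), retrying with exponential backoff on any exception.

    Re-raises the last exception if all attempts fail. In production
    tooling you would usually also log each failure and cap the delay.
    """
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))
```

The same pattern underlies much of what Terraform providers and orchestration tools do internally when a cloud API returns a transient error.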
Requirements:
- B.E./B.Tech in CS, Engineering or equivalent experience.
- 3+ years in production infra and cloud-based systems.
- Strong background in Linux (RHEL/CentOS) and shell scripting.
- Experience managing hybrid infrastructure (cloud + on-prem).
- Strong testing practices and code quality focus.
- Experience leading teams is a plus.

Similar companies
About the company
Fractal is one of the most prominent players in the Artificial Intelligence space. Fractal's mission is to power every human decision in the enterprise, bringing AI, engineering, and design to help the world's most admired Fortune 500® companies.
Fractal's products include Qure.ai to assist radiologists in making better diagnostic decisions, Crux Intelligence to assist CEOs and senior executives in making better tactical and strategic decisions, Theremin.ai to improve investment decisions, Eugenie.ai to find anomalies in high-velocity data, Samya.ai to drive next-generation Enterprise Revenue Growth Management, and Senseforth.ai to automate customer interactions at scale to grow top-line and bottom-line. Analytics Vidhya is the largest Analytics and Data Science community, offering industry-focused training programs.
Fractal has more than 3,600 employees across 16 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia. Fractal has consistently been rated one of India's best companies to work for by The Great Place to Work® Institute; was featured as a leader in the Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020, and Specialized Insights Service Providers Wave™ 2020 by Forrester Research; was named a leader in the Analytics & AI Services Specialists Peak Matrix 2021 by Everest Group; and was recognized as an "Honorable Vendor" in the 2022 Magic Quadrant™ for data & analytics by Gartner. For more information, visit fractal.ai
About the company
We enable and empower our partners to engage their clients through smart technology, using technology innovations designed to squeeze savings and efficiency out of the current insurance industry model. The belief driving us is that the insurance industry is ripe for innovation and disruption. We offer ultra-customized tools and use new streams of data from internet-enabled devices to dynamically price premiums according to observed behavior.
About the company
We build network solutions for the emerging Next Generation Central Office (NGCO) market.
We have re-applied design patterns from the hyper-scale world to Service Provider and cloud networks in order to implement new features into operational networks faster.
Our parallel modular architecture allows customers programmability, performance and scale to alter CAPEX and OPEX.
About the company
Data Axle is a data-driven marketing solutions provider that helps clients with clean data, lead generation, strategy development, campaign design, and day-to-day execution needs. It solves the problem of inaccurate and incomplete data, enabling businesses to make informed decisions and drive growth. Data Axle operates in various industries, including healthcare, finance, retail, and technology.
About Data Axle:
Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global center of excellence in Pune. This center delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.
Data Axle India is recognized as a Great Place to Work!
This prestigious designation is a testament to our collective efforts in fostering an exceptional workplace culture and creating an environment where every team member can thrive.
About the company
Sun King is a leading global provider of off-grid solar energy solutions, designed to serve the 1.8 billion people who lack reliable or affordable access to traditional electrical grids. With a mission to power brighter lives, the company focuses on underserved markets across Africa and Asia. Sun King's product range includes solar lanterns, solar home systems, and solar inverters, tailored to meet a variety of energy needs—from portable lighting to powering entire homes.
The company's innovative solutions, such as the recently launched PowerHub 3300 and expandable solar home systems, reflect its commitment to meeting evolving customer demands. With operations in over 40 countries and millions of products sold, Sun King makes solar energy accessible through pay-as-you-go financing options. The company’s network of field agents plays a key role in selling, installing, and servicing products, driving local economic development. Rooted in sustainability, Sun King also implements a Sustainable Financing Framework and ensures customer satisfaction through extensive service centers and after-sales support.
About the company
We are an InsurTech start-up based out of Bangalore, with a focus on healthcare. CoverSelf empowers healthcare insurance companies with a truly next-gen, cloud-native, holistic, and customizable platform that prevents and adapts to ever-evolving claims and payment inaccuracies, reducing complexity and administrative costs with a unified healthcare-dedicated platform.