50+ Amazon S3 Jobs in India


Job Title: Senior Node.js and Python Azure Developer (AWS to Azure Migration Expert)
Experience: 7-10 Yrs.
Primary Skills:
Node.js and Python
Hands-on experience with Azure, Serverless (Azure Functions)
AWS to Azure Cloud Migration (Preferred)
Scope of Work:
- Hands-on experience in migrating Node.js and Python applications from AWS to the Azure environment
- Analyse source architecture, source code, and AWS service dependencies to identify code remediation scenarios.
- Perform the code remediation/refactoring and configuration changes required to deploy the application on Azure, including remediating Azure service dependencies and other application dependencies in the source code.
- 7+ years of experience in application development with Node.js and Python
- Experience in Unit testing, application testing support and troubleshooting on Azure.
- Experience in application deployment scripts/pipelines, App service, APIM, AKS/Microservices/containerized apps, Kubernetes, helm charts.
- Hands-on experience in developing apps for AWS and Azure (Must Have)
- Hands-on experience with Azure services for application development (AKS, Azure Functions) and deployments.
- Understanding of Azure infrastructure services required for hosting applications on Azure PaaS or Serverless.
- Tech stack details:
- Confluent Kafka AWS S3 Sink Connector
- Azure Blob Storage
- AWS lambda to Azure Functions (Serverless) – Python or Node.js
- NodeJS REST API
- S3 to Azure Blob Storage
- AWS to Azure SDK Conversion (Must Have)
Educational qualification:
B.E/B.Tech/MCA
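One remediation pattern in the tech stack above is moving object storage access from S3 to Azure Blob Storage. The following is a minimal Python sketch of that conversion, not the project's actual code: the bucket, container, and key names are placeholders, the AWS side uses boto3, and the Azure side assumes the azure-storage-blob package with a connection string in the environment.

```python
import os

import boto3
from azure.storage.blob import BlobServiceClient

BUCKET = "legacy-bucket"            # placeholder AWS bucket
CONTAINER = "migrated-container"    # placeholder Azure container
KEY = "reports/2024/summary.json"   # placeholder object key

# AWS side: read the object with boto3 (credentials come from the environment).
s3 = boto3.client("s3")
payload = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()

# Azure side: write the same object to Blob Storage with azure-storage-blob.
blob_service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
blob_client = blob_service.get_blob_client(container=CONTAINER, blob=KEY)
blob_client.upload_blob(payload, overwrite=True)
```

The same shape of change applies on the Node.js side, where calls against the aws-sdk S3 client would be replaced with @azure/storage-blob.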


Job Title : Python Developer – API Integration & AWS Deployment
Experience : 5+ Years
Location : Bangalore
Work Mode : Onsite
Job Overview :
We are seeking an experienced Python Developer with strong expertise in API development and AWS cloud deployment.
The ideal candidate will be responsible for building scalable RESTful APIs, automating power system simulations using PSS®E (psspy), and deploying automation workflows securely and efficiently on AWS.
Mandatory Skills : Python, FastAPI/Flask, PSS®E (psspy), RESTful API Development, AWS (EC2, Lambda, S3, EFS, API Gateway), AWS IAM, CloudWatch.
Key Responsibilities :
Python Development & API Integration :
- Design, build, and maintain RESTful APIs using FastAPI or Flask to interface with PSS®E.
- Automate simulations and workflows using the PSS®E Python API (psspy).
- Implement robust bulk case processing, result extraction, and automated reporting systems.
AWS Cloud Deployment :
- Deploy APIs and automation pipelines using AWS services such as EC2, Lambda, S3, EFS, and API Gateway.
- Apply cloud-native best practices to ensure reliability, scalability, and cost efficiency.
- Manage secure access control using AWS IAM, API keys, and implement monitoring using CloudWatch.
Required Skills :
- 5+ Years of professional experience in Python development.
- Hands-on experience with RESTful API development (FastAPI/Flask).
- Solid experience working with PSS®E and its psspy Python API.
- Strong understanding of AWS services, deployment, and best practices.
- Proficiency in automation, scripting, and report generation.
- Knowledge of cloud security and monitoring tools like IAM and CloudWatch.
Good to Have :
- Experience in power system simulation and electrical engineering concepts.
- Familiarity with CI/CD tools for AWS deployments.
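As a rough illustration of the API layer described above, here is a minimal FastAPI sketch exposing a simulation endpoint. The run_powerflow helper, its parameters, and the endpoint path are hypothetical stand-ins; the actual psspy automation is not shown and would replace the NotImplementedError.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="PSS/E Simulation API")


class SimulationRequest(BaseModel):
    case_file: str        # hypothetical: path to a saved case on EFS or S3
    solver: str = "fnsl"  # hypothetical solver selector


def run_powerflow(case_file: str, solver: str) -> dict:
    """Hypothetical wrapper around the psspy API: the real implementation would
    initialise PSS(R)E, load the case, run the solver, and extract results."""
    raise NotImplementedError


@app.post("/simulations")
def create_simulation(req: SimulationRequest) -> dict:
    try:
        results = run_powerflow(req.case_file, req.solver)
    except NotImplementedError:
        raise HTTPException(status_code=501, detail="psspy integration not wired up in this sketch")
    return {"status": "completed", "results": results}
```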

Job Title : Full Stack Drupal Developer
Experience : Minimum 5 Years
Location : Hyderabad / Bangalore / Mumbai / Pune / Chennai / Gurgaon (Hybrid or On-site)
Notice Period : Immediate to 15 Days Preferred
Job Summary :
We are seeking a skilled and experienced Full Stack Drupal Developer with a strong background in Drupal (version 8 and above) for both front-end and back-end development. The ideal candidate will have hands-on experience in AWS deployments, Drupal theming and module development, and a solid understanding of JavaScript, PHP, and core Drupal architecture. Acquia certifications and contributions to the Drupal community are highly desirable.
Mandatory Skills :
Drupal 8+, PHP, JavaScript, Custom Module & Theming Development, AWS (EC2, Lightsail, S3, CloudFront), Acquia Certified, Drupal Community Contributions.
Key Responsibilities :
- Develop and maintain full-stack Drupal applications, including both front-end (theming) and back-end (custom module) development.
- Deploy and manage Drupal applications on AWS using services like EC2, Lightsail, S3, and CloudFront.
- Work with the Drupal theming layer and module layer to build custom and reusable components.
- Write efficient and scalable PHP code integrated with JavaScript and core JS concepts.
- Collaborate with UI/UX teams to ensure high-quality user experiences.
- Optimize performance and ensure high availability of applications in cloud environments.
- Contribute to the Drupal community and utilize contributed modules effectively.
- Follow best practices for code versioning, documentation, and CI/CD deployment processes.
Required Skills & Qualifications :
- Minimum 5 Years of hands-on experience in Drupal development (Drupal 8 onwards).
- Strong experience in front-end (theming, JavaScript, HTML, CSS) and back-end (custom module development, PHP).
- Experience with Drupal deployment on AWS, including services such as EC2, Lightsail, S3, and CloudFront.
- Proficiency in JavaScript, core JS concepts, and PHP coding.
- Acquia certifications such as:
- Drupal Developer Certification
- Site Management Certification
- Acquia Certified Developer (preferred)
- Experience with contributed modules and active participation in the Drupal community is a plus.
- Familiarity with version control (Git), Agile methodologies, and modern DevOps tools.
Preferred Certifications :
- Acquia Certified Developer.
- Acquia Site Management Certification.
- Any relevant AWS certifications are a bonus.

We are looking for a Senior Software Engineer with 5+ years of experience in modern C++ development, paired with strong hands-on skills in AWS, Node.js, data processing, and containerized service development. The ideal candidate will be responsible for building scalable systems, maintaining complex data pipelines, and modernizing applications through cloud-native approaches and automation.
This is a high-impact role where engineering depth meets platform evolution, ideal for someone who thrives in system-level thinking, data-driven applications, and full-stack delivery.
Key Responsibilities:
- Design, build, and maintain high-performance systems using modern C++
- Develop and deploy scalable backend services using Node.js and manage dependencies via NPM
- Architect and implement containerized services using Docker, with orchestration via Kubernetes or ECS
- Build, monitor, and maintain data ingestion, transformation, and enrichment pipelines
- Utilize AWS services (Lambda, EC2, S3, CloudWatch, Step Functions) to deliver reliable cloud-native solutions
- Implement and maintain modern CI/CD pipelines, ensuring seamless integration, testing, and delivery
- Participate in system design, peer code reviews, and performance tuning
Required Skills:
- 5+ years of software development experience, with strong command over modern C++
- Solid experience with Node.js, JavaScript, and NPM for backend development
- Deep understanding of cloud platforms (preferably AWS) and hands-on experience in deploying and managing applications in the cloud
- Proficient in building and scaling data processing workflows and working with structured/unstructured data
- Strong hands-on experience with Docker, container orchestration, and microservices architecture
- Working knowledge of CI/CD practices, Git, and build/release tools
- Strong problem-solving, debugging, and cross-functional collaboration skills
Preferred / Nice to Have:
- Exposure to data streaming frameworks (Kafka, Spark, etc.)
- Familiarity with monitoring and observability tools (e.g., Prometheus, Grafana, ELK stack)
- Background in performance profiling, secure coding, or legacy modernization
- Ability to work in agile environments and lead small technical initiatives
Job description
● Design effective, scalable architectures on top of cloud technologies such as AWS and Kubernetes
● Mentor other software engineers, including actively participating in peer code and architecture review
● Participate in all parts of the development lifecycle from design to coding to deployment to maintenance and operations
● Kickstart new ideas, build proof of concepts and jumpstart newly funded projects
● Demonstrate ability to work independently with minimal supervision
● Embed with other engineering teams on challenging initiatives and time sensitive projects
● Collaborate with other engineering teams on challenging initiatives and time sensitive projects
Education and Experience
● BS degree in Computer Science or related technical field or equivalent practical experience.
● 9+ years of professional software development experience focused on payments and/or billing and customer accounts. Worked with worldwide payments, billing systems, PCI Compliance & payment gateways.
Technical and Functional
● Extensive knowledge of microservice development using Spring, Spring Boot, and Java, built on top of Kubernetes and public cloud services such as AWS (Lambda, S3).
● Experience with relational databases (MySQL, DB2 or Oracle) and NoSQL databases
● Experience with unit testing and test driven development
Technologies at Constant Contact
Working on the Constant Contact platform provides our engineers with an opportunity to produce high impact work inside of our multifaceted platform (Email, Social, SMS, E-Commerce, CRM, Customer Data Platform, ML-Based Recommendations & Insights, and more).
As a member of our team, you'll be utilizing the latest technologies and frameworks (React/SPA, JavaScript/TypeScript, Swift, Kotlin, GraphQL, etc.) and deploying code to our cloud-first microservice infrastructure (declarative CI/CD, GitOps-managed Kubernetes) with regular opportunities to level up your skills.
● Past experience of working with and integrating payment gateways and processors, online payment methods, and billing systems.
● Familiar with integrating Stripe/Plaid/PayPal/Adyen/Cybersource or similar systems along with PCI compliance.
● International software development and payments experience is a plus.
● Knowledge of DevOps and CI/CD, and automated test and build tools (Jenkins & Gradle/Maven)
● Experience integrating with sales tax engines is a plus.
● Familiar with tools like Splunk and New Relic, or similar tools like Datadog, Elastic ELK, or Amazon CloudWatch.
● Good to have - Experience with React, Backbone, Marionette or other front end frameworks.
Cultural
● Strong verbal and written communication skills.
● Flexible attitude and willingness to frequently move between different teams, software architectures and priorities.
● Desire to collaborate with our other product teams to think strategically about how to solve problems.
Our team
● We focus on cross-functional team collaboration where engineers, product managers, and designers all work together to solve customer problems and build exciting features.
● We love new ideas and are eager to see what your experiences can bring to help influence our technical and product vision.
● Collaborate/Overlap with the teams in Eastern Standard Time (EST), USA.

About the job
Location: Bangalore, India
Job Type: Full-Time | On-Site
Job Description
We are looking for a highly skilled and motivated Python Backend Developer to join our growing team in Bangalore. The ideal candidate will have a strong background in backend development with Python, deep expertise in relational databases like MySQL, and hands-on experience with AWS cloud infrastructure.
Key Responsibilities
- Design, develop, and maintain scalable backend systems using Python.
- Architect and optimize relational databases (MySQL), including complex queries and indexing.
- Manage and deploy applications on AWS cloud services (EC2, S3, RDS, DynamoDB, API Gateway, Lambda).
- Automate cloud infrastructure using CloudFormation or Terraform.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Mentor junior developers and contribute to a culture of technical excellence.
- Proactively identify issues and provide solutions to challenging backend problems.
Mandatory Requirements
- Minimum 3 years of professional experience in Python backend development.
- Expert-level knowledge in MySQL database creation, optimization, and query writing.
- Strong experience with AWS services, particularly EC2, S3, RDS, DynamoDB, API Gateway, and Lambda.
- Hands-on experience with infrastructure as code using CloudFormation or Terraform.
- Proven problem-solving skills and the ability to work independently.
- Demonstrated leadership abilities and team collaboration skills.
- Excellent verbal and written communication.
Job Title: Data Analytics Engineer
Experience: 3 to 6 years
Location: Gurgaon (Hybrid)
Employment Type: Full-time
Job Description:
We are seeking a highly skilled Data Analytics Engineer with expertise in Qlik Replicate, Qlik Compose, and Data Warehousing to build and maintain robust data pipelines. The ideal candidate will have hands-on experience with Change Data Capture (CDC) pipelines from various sources, an understanding of Bronze, Silver, and Gold data layers, SQL querying for data warehouses like Amazon Redshift, and experience with Data Lakes using S3. A foundational understanding of Apache Parquet and Python is also desirable.
Key Responsibilities:
1. Data Pipeline Development & Maintenance
- Design, develop, and maintain ETL/ELT pipelines using Qlik Replicate and Qlik Compose.
- Ensure seamless data replication and transformation across multiple systems.
- Implement and optimize CDC-based data pipelines from various source systems.
2. Data Layering & Warehouse Management
- Implement Bronze, Silver, and Gold layer architectures to optimize data workflows.
- Design and manage data pipelines for structured and unstructured data.
- Ensure data integrity and quality within Redshift and other analytical data stores.
3. Database Management & SQL Development
- Write, optimize, and troubleshoot complex SQL queries for data warehouses like Redshift.
- Design and implement data models that support business intelligence and analytics use cases.
4. Data Lakes & Storage Optimization
- Work with AWS S3-based Data Lakes to store and manage large-scale datasets.
- Optimize data ingestion and retrieval using Apache Parquet.
5. Data Integration & Automation
- Integrate diverse data sources into a centralized analytics platform.
- Automate workflows to improve efficiency and reduce manual effort.
- Leverage Python for scripting, automation, and data manipulation where necessary.
6. Performance Optimization & Monitoring
- Monitor data pipelines for failures and implement recovery strategies.
- Optimize data flows for better performance, scalability, and cost-effectiveness.
- Troubleshoot and resolve ETL and data replication issues proactively.
Technical Expertise Required:
- 3 to 6 years of experience in Data Engineering, ETL Development, or related roles.
- Hands-on experience with Qlik Replicate & Qlik Compose for data integration.
- Strong SQL expertise, with experience in writing and optimizing queries for Redshift.
- Experience working with Bronze, Silver, and Gold layer architectures.
- Knowledge of Change Data Capture (CDC) pipelines from multiple sources.
- Experience working with AWS S3 Data Lakes.
- Experience working with Apache Parquet for data storage optimization.
- Basic understanding of Python for automation and data processing.
- Experience in cloud-based data architectures (AWS, Azure, GCP) is a plus.
- Strong analytical and problem-solving skills.
- Ability to work in a fast-paced, agile environment.
Preferred Qualifications:
- Experience in performance tuning and cost optimization in Redshift.
- Familiarity with big data technologies such as Spark or Hadoop.
- Understanding of data governance and security best practices.
- Exposure to data visualization tools such as Qlik Sense, Tableau, or Power BI.
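Since the role above combines S3 data lakes with Apache Parquet, here is a minimal, hedged sketch of writing and reading a partitioned Parquet dataset with pandas and pyarrow. The bucket, prefix, and column names are placeholders, and it assumes pyarrow and s3fs are installed with AWS credentials configured.

```python
import pandas as pd

# Hypothetical "silver"-layer table; the bucket and prefix below are placeholders.
orders = pd.DataFrame(
    {
        "order_id": [101, 102, 103],
        "order_date": ["2024-05-01", "2024-05-01", "2024-05-02"],
        "amount": [250.0, 99.5, 410.0],
    }
)

# Write partitioned Parquet into the S3 data lake (requires pyarrow + s3fs).
orders.to_parquet(
    "s3://my-datalake/silver/orders/",
    engine="pyarrow",
    partition_cols=["order_date"],
    index=False,
)

# Read it back for downstream loads (e.g., COPY into Redshift or Spectrum queries).
silver_orders = pd.read_parquet("s3://my-datalake/silver/orders/", engine="pyarrow")
print(silver_orders.head())
```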
We are seeking a skilled and motivated Software Engineer with over 3 years of experience in designing and developing web-based applications using Node.js.
Key Responsibilities
- Design, develop, and maintain web-based applications using Node.js.
- Build scalable, high-performance RESTful APIs using Express.js or Restify frameworks.
- Develop and maintain robust SQL database systems, leveraging Sequelize ORM.
- Ensure responsiveness of applications across various devices and platforms.
- Collaborate with cross-functional teams during the product development lifecycle, including prototyping, hardening, and testing phases.
- Work with real-time communication technologies and ensure seamless integration.
- Learn and adapt to alternative technologies as needed to meet project requirements.
Required Skills & Experience
- 3+ years of experience in web application development using Node.js.
- Proficiency with frameworks such as Express.js or Restify.
- Strong expertise in SQL databases and experience with Sequelize ORM.
- In-depth understanding of JavaScript, browser technologies, and real-time communication.
- Hands-on experience in developing responsive web applications.
- Experience with React Native (a plus).
- Proficiency in Java.
- Familiarity with product development lifecycle, including prototyping, testing, and deployment.
Additional Skills & Experience
- Experience with NoSQL databases such as MongoDB or Cassandra.
- Knowledge of internationalization (i18n) and latest UI/UX design trends.
- Familiarity with JavaScript libraries/frameworks like ReactJS or VueJS.
- Experience integrating payment gateways for various countries.
- Strong communication skills and ability to facilitate group discussions effectively.
- Eagerness to contribute to product functionality and user experience designs.
Education Requirements
- Bachelor's or Master's degree in Computer Science or a related field.

Job Title : Sr. Data Engineer
Experience : 5+ Years
Location : Noida (Hybrid – 3 Days in Office)
Shift Timing : 2-11 PM
Availability : Immediate
Job Description :
- We are seeking a Senior Data Engineer to design, develop, and optimize data solutions.
- The role involves building ETL pipelines, integrating data into BI tools, and ensuring data quality while working with SQL, Python (Pandas, NumPy), and cloud platforms (AWS/GCP).
- You will also develop dashboards using Looker Studio and work with AWS services like S3, Lambda, Glue ETL, Athena, RDS, and Redshift.
- Strong debugging, collaboration, and communication skills are essential.
Job Title : Senior AWS Data Engineer
Experience : 5+ Years
Location : Gurugram
Employment Type : Full-Time
Job Summary :
Seeking a Senior AWS Data Engineer with expertise in AWS to design, build, and optimize scalable data pipelines and data architectures. The ideal candidate will have experience in ETL/ELT, data warehousing, and big data technologies.
Key Responsibilities :
- Build and optimize data pipelines using AWS (Glue, EMR, Redshift, S3, etc.).
- Maintain data lakes & warehouses for analytics.
- Ensure data integrity through quality checks.
- Collaborate with data scientists & engineers to deliver solutions.
Qualifications :
- 7+ Years in Data Engineering.
- Expertise in AWS services, SQL, Python, Spark, Kafka.
- Experience with CI/CD, DevOps practices.
- Strong problem-solving skills.
Preferred Skills :
- Experience with Snowflake, Databricks.
- Knowledge of BI tools (Tableau, Power BI).
- Healthcare/Insurance domain experience is a plus.
Good understanding and experience of HTML / CSS / JavaScript.
Hands-on experience with ES6 / ES7 / ES8 features.
Thorough understanding of the Request Lifecycle (including Event Queue, Event Loop, Worker Threads, etc.).
Familiarity with security principles including SSL protocols, data encryption, XSS, CSRF.
Expertise in Web Services / REST APIs will be beneficial.
Proficiency in Linux and deployment on Linux are valuable.
Knowledge of ORMs like Sequelize and ODMs like Mongoose, and the ability to handle DB transactions, is a necessity.
Experience with Angular JS / React JS will be an added advantage.
Expertise with RDBMS like MySQL / PostgreSQL will be a plus.
Knowledge of AWS services like S3, EC2 will be helpful.
Understanding of Agile and CI/CD will be of value.
Job Title : Tech Lead - Data Engineering (AWS, 7+ Years)
Location : Gurugram
Employment Type : Full-Time
Job Summary :
Seeking a Tech Lead - Data Engineering with expertise in AWS to design, build, and optimize scalable data pipelines and data architectures. The ideal candidate will have experience in ETL/ELT, data warehousing, and big data technologies.
Key Responsibilities :
- Build and optimize data pipelines using AWS (Glue, EMR, Redshift, S3, etc.).
- Maintain data lakes & warehouses for analytics.
- Ensure data integrity through quality checks.
- Collaborate with data scientists & engineers to deliver solutions.
Qualifications :
- 7+ Years in Data Engineering.
- Expertise in AWS services, SQL, Python, Spark, Kafka.
- Experience with CI/CD, DevOps practices.
- Strong problem-solving skills.
Preferred Skills :
- Experience with Snowflake, Databricks.
- Knowledge of BI tools (Tableau, Power BI).
- Healthcare/Insurance domain experience is a plus.
BACKEND DEVELOPER JOB DESCRIPTION
Job Title: Backend Developer - Node.js & MongoDB
Location: Hyderabad
Employment Type: Full-Time
Experience Required: 3–5 Years
About Us
Inncircles – THE INNGINEERING COMPANY
We are a forward-thinking construction-tech innovator building CRM solutions that manage crores of records with precision and speed. Our mission is to revolutionize the construction domain through scalable engineering and robust backend systems. Join us to solve complex challenges and shape the future of data-driven construction tech!
Job Description
We are hiring a Backend Developer with 3–5 years of hands-on experience in Node.js and MongoDB to design, optimize, and maintain high-performance backend systems. You will work on large-scale data processing, external integrations, and scalable architectures while ensuring best coding practices and efficient database design.
Key Responsibilities
Backend Development & Optimization
- Develop and maintain RESTful/GraphQL APIs using Node.js, adhering to best coding practices and reusable code structures.
- Write optimized MongoDB queries for collections with crores of records, ensuring efficient data retrieval and storage.
- Design MongoDB collections, implement indexing strategies, and optimize replica sets for performance and reliability.
Scalability & Performance
- Implement horizontal and vertical scaling strategies to handle growing data and traffic.
- Optimize database performance through indexing, aggregation pipelines, and query tuning.
External Integrations & Debugging
- Integrate third-party APIs (payment gateways, analytics tools, etc.) and SDKs seamlessly into backend systems.
- Debug and resolve complex issues in production environments with a systematic, data-driven approach.
AWS & Cloud Services
Work with AWS services like Lambda (serverless), SQS (message queuing), S3 (storage), and EC2 (compute) to build resilient and scalable solutions.
Collaboration & Best Practices
Collaborate with frontend teams to ensure smooth API integrations and data flow.
Document code, write unit/integration tests, and enforce coding standards.
Mandatory Requirements
3–5 years of professional experience in Node.js and MongoDB.
Expertise in:
- MongoDB: Collection design, indexing, aggregation pipelines, replica sets, and sharding.
- Node.js: Asynchronous programming, middleware, and API development (Express.js/Fastify).
- Query Optimization: Writing efficient queries for large datasets (crores of records).
- Strong debugging skills and experience in resolving production issues.
- Hands-on experience with external integrations (APIs, SDKs, webhooks).
- Knowledge of horizontal/vertical scaling techniques and performance tuning.
- Familiarity with AWS services (Lambda, SQS, S3, EC2).
Preferred Skills
- Experience with microservices architecture.
- Knowledge of CI/CD pipelines (GitLab CI, Jenkins).
- Understanding of Docker, Kubernetes, or serverless frameworks.
- Exposure to monitoring tools like Prometheus, Grafana, or New Relic.
Why Join Inncircles?
Solve large-scale data challenges in the construction domain.
Work on cutting-edge cloud-native backend systems.
Competitive salary, flexible work culture, and growth opportunities.
Apply Now:
If you’re passionate about building scalable backend systems and thrive in a data-heavy environment, share your resume and a GitHub/portfolio link showcasing projects with Node.js, MongoDB, and AWS integrations.
Inncircles – THE INNGINEERING COMPANY
📍 Hyderabad | 🚀 Building Tomorrow’s Tech Today
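The role above centres on index design and aggregation pipelines over very large MongoDB collections. The idea is language-agnostic; for consistency with the other sketches in this listing it is shown below with Python's pymongo rather than the Node.js driver, and the connection string, database, and field names are all placeholders.

```python
from pymongo import ASCENDING, DESCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")   # placeholder connection string
projects = client["construction_crm"]["projects"]   # placeholder database/collection

# Compound index so tenant-scoped, time-sorted reads stay fast on very large collections.
projects.create_index([("tenant_id", ASCENDING), ("updated_at", DESCENDING)])

# Aggregation pipeline: per-site spend for one tenant, most recently updated first.
pipeline = [
    {"$match": {"tenant_id": "acme", "status": "active"}},
    {"$group": {
        "_id": "$site_id",
        "total_spend": {"$sum": "$spend"},
        "last_update": {"$max": "$updated_at"},
    }},
    {"$sort": {"last_update": -1}},
    {"$limit": 20},
]
for row in projects.aggregate(pipeline):
    print(row)
```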


Job Title : MERN Stack Developer
Experience : 5+ Years
Shift Timings : 8:00 AM to 5:00 PM
Role Overview:
We are hiring a skilled MERN Stack Developer to build scalable web applications. You’ll work on both front-end and back-end, leveraging modern frameworks and cloud technologies to deliver high-quality solutions.
Key Responsibilities :
- Develop responsive UIs using React, GraphQL, and TypeScript.
- Build back-end APIs with Node.js, Express, and MySQL.
- Integrate AWS services like Lambda, S3, and API Gateway.
- Optimize deployments using AWS CDK and CloudFormation.
- Ensure code quality with Mocha/Chai/Sinon, ESLint, and Prettier.
Required Skills :
- Strong experience with React, Node.js, and GraphQL.
- Proficiency in AWS services and Infrastructure as Code (CDK/Terraform).
- Familiarity with MySQL, Elasticsearch, and modern testing frameworks.


Job description
Key Responsibilities
- Design, develop, and maintain serverless applications using AWS services such as Lambda, API Gateway, DynamoDB, and S3.
- Collaborate with front-end developers to integrate user-facing elements with server-side logic.
- Build and maintain RESTful APIs to support web and mobile applications.
- Implement security best practices for AWS services and manage IAM roles and policies.
- Optimize application performance, scalability, and reliability through monitoring and testing.
- Write clean, maintainable, and efficient code following best practices and design patterns.
- Participate in code reviews, providing constructive feedback to peers.
- Troubleshoot and debug applications, identifying performance bottlenecks and areas for improvement.
- Stay updated with emerging technologies and industry trends related to serverless architectures and Python development.
Qualifications
- Bachelors degree in Computer Science, Engineering, or related field, or equivalent experience.
- Proven experience as a Python backend developer, with a strong portfolio of serverless applications.
- Proficiency in AWS services, particularly in serverless architectures (Lambda, API Gateway, DynamoDB, etc.).
- Solid understanding of RESTful API design principles and best practices.
- Familiarity with CI/CD practices and tools (e.g., AWS CodePipeline, Jenkins).
- Experience with containerization technologies (Docker, Kubernetes) is a plus.
- Strong problem-solving skills and the ability to work independently and collaboratively.
- Excellent communication skills, both verbal and written.
Preferred Skills
- Experience with frontend technologies (JavaScript, React, Angular) is a plus.
- Knowledge of data storage solutions (SQL and NoSQL databases).
- AWS certifications (e.g., AWS Certified Developer Associate) are a plus.
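A minimal sketch of the serverless pattern described above: a Lambda handler behind an API Gateway proxy integration that writes an item to DynamoDB with boto3. The table name, environment variable, and payload shape are assumptions, not part of the posting.

```python
import json
import os
import uuid

import boto3

# In Lambda, the region and credentials come from the execution environment;
# the table name is read from an environment variable ("items" is a placeholder default).
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "items"))


def handler(event, context):
    """Entry point for an API Gateway proxy integration (e.g., POST /items)."""
    body = json.loads(event.get("body") or "{}")
    item = {"pk": str(uuid.uuid4()), "name": body.get("name", "unnamed")}
    table.put_item(Item=item)
    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(item),
    }
```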
Qualifications:
1. 10+ years of experience, with 3+ years as Database Architect or related role
2. Technical expertise in data schemas, Amazon Redshift, Amazon S3, and Data Lakes
3. Analytical skills in data warehouse design and business intelligence
4. Strong problem-solving and strategic thinking abilities
5. Excellent communication skills
6. Bachelor's degree in Computer Science or related field; Master's degree preferred
Skills Required:
1. Database architecture and design
2. Data warehousing and business intelligence
3. Cloud-based data infrastructure (Amazon Redshift, S3, Data Lakes)
4. Data governance and security
5. Analytical and problem-solving skills
6. Strategic thinking and communication
7. Collaboration and team management
Technical Skills:
- Ability to understand and translate business requirements into design.
- Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
- Experience in creating ETL jobs using Python/PySpark.
- Proficiency in creating AWS Lambda functions for event-based jobs.
- Knowledge of automating ETL processes using AWS Step Functions.
- Competence in building data warehouses and loading data into them.
Responsibilities:
- Understand business requirements and translate them into design.
- Assess AWS infrastructure needs for development work.
- Develop ETL jobs using Python/PySpark to meet requirements.
- Implement AWS Lambda for event-based tasks.
- Automate ETL processes using AWS Step Functions.
- Build data warehouses and manage data loading.
- Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
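To make the ETL responsibilities above concrete, here is a small PySpark sketch that reads raw CSV from S3, applies a transformation, and writes partitioned Parquet back to the lake. The paths and columns are placeholders, and it assumes a Spark runtime (e.g., Glue or EMR) with S3 access already configured.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV landed in the lake (placeholder path).
raw = spark.read.option("header", True).csv("s3://raw-zone/orders/")

# Transform: type the amount column and keep only completed orders.
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("status") == "completed")
)

# Load: partitioned Parquet for downstream Athena / Redshift Spectrum queries.
curated.write.mode("overwrite").partitionBy("order_date").parquet("s3://curated-zone/orders/")

spark.stop()
```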


We're seeking an experienced Senior Tech Lead to oversee both frontend and backend teams for a production-ready enterprise project. You should possess strong managerial skills along with an entrepreneurial mindset. In this dynamic role, you'll collaborate with cross-functional teams to design, build, and deploy products aligned with our vision and strategy. Your leadership will be key in driving product success from conception to launch, ensuring they meet business objectives and user expectations.
Experience: 7+ Years
Working Time: 12.30 PM to 9.30 PM
Responsibilities:
- Lead and mentor developers: Provide guidance and support to ensure high-quality deliverables, drive engineering best practices, foster collaboration, and mentor junior engineers.
- Collaborate with cross-functional teams: Define project requirements, timelines, and priorities, and coordinate product releases.
- Architect scalable solutions: Design systems for both frontend and backend that are maintainable using modern architecture patterns and RESTful API design principles.
- Project Management Skills: Proficient in Agile methodologies, managing timelines, and prioritizing tasks effectively to ensure project success.
- Define product features: Set sprint goals and translate user feedback into actionable enhancements.
- Problem-Solving Skills: Analytical problem-solver adept at addressing technical challenges and implementing practical solutions.
- Analyze data: Validate product goals and adapt strategies accordingly, and track project progress to ensure timely delivery of features.
- Test and accept product features: Ensure accurate implementation of product features based on user stories.
Requirements
- Bachelor's / Master’s degree in Computer Science or related field.
- Minimum of 3 years of experience in a leadership role.
- Nice to have: Experience in building a product from concept to launch.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively across teams.
- Strong proficiency in NodeJS, RESTful APIs, Weaviate Vector Database, and graph databases.
- Proficient in NestJS with full lifecycle experience and expertise in MongoDB integration.
- Proficient in MongoDB, with expertise in NoSQL principles, instance management, data modeling, and efficient query optimization for cloud and on-premise environments.
- Strong proficiency in ReactJS, NextJS, MaterialUI, and React Query.
- Proficient in TypeScript development, skilled in building type-safe applications and leveraging TypeScript configurations for enhanced development efficiency.
- Proficient in AWS services, specializing in Lambda for serverless computing, API Gateway for secure API management, and integration with IAM, S3, and CloudWatch.
- Knowledge of CI/CD principles and tools to automate the testing and deployment of applications.
Benefits
- Gain real-world experience in corporate functioning.
- Learn to collaborate with diverse teams and meet deadlines in a professional environment.
- Access various learning and development programs to explore your passion.
- Work in a fast-paced, rapidly expanding tech team undergoing a revamp, with exposure to advanced technology and tools relevant to your role.

We are Seeking:
1. AWS Serverless, AWS CDK:
Proficiency in developing serverless applications using AWS Lambda, API Gateway, S3, and other relevant AWS services.
Experience with AWS CDK for defining and deploying cloud infrastructure.
Knowledge of serverless design patterns and best practices.
Understanding of Infrastructure as Code (IaC) concepts.
Experience in CI/CD workflows with AWS CodePipeline and CodeBuild.
2. TypeScript, React/Angular:
Proficiency in TypeScript.
Experience in developing single-page applications (SPAs) using React.js or Angular.
Knowledge of state management libraries like Redux (for React) or RxJS (for Angular).
Understanding of component-based architecture and modern frontend development practices.
3. Node.js:
Strong proficiency in backend development using Node.js.
Understanding of asynchronous programming and event-driven architecture.
Familiarity with RESTful API development and integration.
4. MongoDB/NoSQL:
Experience with NoSQL databases and their use cases.
Familiarity with data modeling and indexing strategies in NoSQL databases.
Ability to integrate NoSQL databases into serverless architectures.
5. CI/CD:
Ability to troubleshoot and debug CI/CD pipelines.
Knowledge of automated testing practices and tools.
Understanding of deployment automation and release management processes.
Educational Background: Bachelor's degree in Computer Science, Engineering, or a related field.
Certification (Preferred - Added Advantage): AWS certifications (e.g., AWS Certified Developer - Associate)
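For illustration of the CDK-based IaC described above, here is a small stack defining a Lambda function fronted by API Gateway. The posting calls for TypeScript; this sketch uses CDK's Python bindings only to stay consistent with the other examples in this listing, and the construct names and asset path are hypothetical.

```python
from aws_cdk import App, Stack, aws_apigateway as apigw, aws_lambda as _lambda
from constructs import Construct


class HelloApiStack(Stack):
    """Serverless API: a Lambda function behind an API Gateway REST API."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        handler = _lambda.Function(
            self,
            "HelloHandler",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="index.handler",
            code=_lambda.Code.from_asset("lambda"),  # placeholder directory with index.py
        )
        apigw.LambdaRestApi(self, "HelloApi", handler=handler)


app = App()
HelloApiStack(app, "HelloApiStack")
app.synth()
```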

Simply Fleet is a fast-growing SaaS solution to help automate an organization's fleet maintenance operations. You can learn more about our product by going to www.simply-fleet.com
We are looking for an enthusiastic and proactive Android developer to manage the app development of Simply Fleet on Android. The developer will work closely with the other members of the product team to build and maintain Simply Fleet's Android app.
What you will do:
- You should be proficient in Android development with Kotlin
- You should have a fair idea about web services
- You should be comfortable working with JSON
- You should have a strong knowledge of Android UI design principles, patterns, and best practices
- You should have a working knowledge of location services
- Knowledge of AWS S3 is a plus
- You should have an understanding of code versioning tools, such as Git
- You should have deployed apps on Google Play Console
- We expect you to be proficient in best coding practices like adding comments, using proper naming conventions, performing unit testing of your code, etc.
- You should be well versed in developing in Android Studio
Who you are?
- You should have 2+ years of experience in Android development
- You should be committed since we follow a hybrid model
- You are expected to be present in our physical office in Pune, MH twice a week
- You should be willing to take complete ownership of your work
- Above all, you should be able to think independently and creatively
You can expect a smooth onboarding process with structured timelines. You can expect teams that listen and learn. You can expect to be counted on, and you'll be given the freedom to do your best work. We build our product, our teams, and our company for the long haul, so you can build your career here if you choose to. This is your platform to be a part of a growing startup and to work with some really awesome folks. We will make sure you have fun along the way.


We require a full-stack Senior SDE with a focus on backend microservices / modular monoliths, with 3-4+ years of experience in the following:
- Bachelor’s or Master’s degree in Computer Science or equivalent industry technical skills
- Mandatory: in-depth knowledge of and strong experience in the Python programming language.
- Expertise and significant work experience in Python with FastAPI and async frameworks.
- Prior experience building microservices and/or modular monoliths.
- Should be an expert in Object-Oriented Programming and Design Patterns.
- Has knowledge and experience with SQLAlchemy/ORM, Celery, Flower, etc.
- Has knowledge and experience with Kafka / RabbitMQ, Redis.
- Experience in Postgres/ Cockroachdb.
- Experience in MongoDB/DynamoDB and/or Cassandra are added advantages.
- Strong experience in AWS services (e.g., EC2, ECS, Lambda, Step Functions, S3, SQS, Cognito) and/or equivalent Azure services preferred.
- Experience working with Docker required.
- Experience in socket.io added advantage
- Experience with CI/CD e.g. git actions preferred.
- Experience in version control tools Git etc.
This is one of the early positions for scaling up the Technology team. So culture-fit is really important.
- The role will require serious commitment, and someone with a similar mindset with the team would be a good fit. It's going to be a tremendous growth opportunity. There will be challenging tasks. A lot of these tasks would involve working closely with our AI & Data Science Team.
- We are looking for someone who has considerable expertise and experience on a low latency highly scaled backend / fullstack engineering stack. The role is ideal for someone who's willing to take such challenges.
- Coding Expectation – 70-80% of time.
- Has worked with enterprise solution company / client or, worked with growth/scaled startup earlier.
- Skills to work effectively in a distributed and remote team environment.



- AWS Cloud Solutions: We expect you to have a strong understanding of AWS services and how to architect solutions using them.
- Use of DynamoDB, S3, Vault, Lambda, or other AWS infrastructure components.
- Design scalable, highly available, and fault-tolerant cloud architectures.
- Continuous Integration and Deployment (CI/CD): Understanding CI/CD pipelines and tools like AWS CodePipeline, CodeCommit, and CodeDeploy is essential.
- Advanced concepts in React Native.




Python Developer
6-8 Years
Mumbai
Notice Period: Immediate joiners only, or candidates serving notice whose last working day falls in the first week of July.
- Python knowledge: object-oriented programming (inheritance, abstract classes, dataclasses, dependency injection), design patterns (command-query, repository, adapter, hexagonal architecture), Swagger/OpenAPI, Flask, Connexion
- Experience with AWS services: Lambda, ECS, SQS, S3, DynamoDB, Aurora DB
- Experience with the following libraries/tools: boto3, behave, pytest, moto, LocalStack, Docker
- Basic knowledge of Terraform and GitLab CI
- Experience with SQL databases
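Given the boto3/pytest/moto stack listed above, here is a hedged sketch of unit-testing an S3-backed helper against moto's in-memory AWS. It assumes moto 5+ (older releases expose service-specific decorators such as mock_s3 instead of mock_aws), and the bucket and key names are placeholders.

```python
import boto3
from moto import mock_aws  # moto >= 5; older releases use mock_s3 instead


def upload_report(bucket: str, key: str, body: bytes) -> None:
    """Function under test: push a report object to S3."""
    boto3.client("s3", region_name="us-east-1").put_object(Bucket=bucket, Key=key, Body=body)


@mock_aws
def test_upload_report_round_trip():
    s3 = boto3.client("s3", region_name="us-east-1")
    s3.create_bucket(Bucket="reports")  # placeholder bucket, created only in moto's fake AWS
    upload_report("reports", "daily.csv", b"a,b\n1,2\n")
    stored = s3.get_object(Bucket="reports", Key="daily.csv")["Body"].read()
    assert stored == b"a,b\n1,2\n"
```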


Requirements
Experience
- 5+ years of professional experience in implementing MLOps framework to scale up ML in production.
- Hands-on experience with Kubernetes, Kubeflow, MLflow, Sagemaker, and other ML model experiment management tools including training, inference, and evaluation.
- Experience in ML model serving (TorchServe, TensorFlow Serving, NVIDIA Triton inference server, etc.)
- Proficiency with ML model training frameworks (PyTorch, Pytorch Lightning, Tensorflow, etc.).
- Experience with GPU computing to do data and model training parallelism.
- Solid software engineering skills in developing systems for production.
- Strong expertise in Python.
- Building end-to-end data systems as an ML Engineer, Platform Engineer, or equivalent.
- Experience working with cloud data processing technologies (S3, ECR, Lambda, AWS, Spark, Dask, ElasticSearch, Presto, SQL, etc.).
- Having Geospatial / Remote sensing experience is a plus.
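Experiment tracking is one of the MLOps capabilities listed above; as a minimal sketch, this logs parameters and a metric for a training run with MLflow. The experiment and run names are placeholders, and with no tracking URI configured the run lands in a local ./mlruns directory.

```python
import mlflow

# With no tracking URI configured, runs are logged locally under ./mlruns;
# point MLFLOW_TRACKING_URI at a shared server in a real setup.
mlflow.set_experiment("segmentation-baseline")  # placeholder experiment name

with mlflow.start_run(run_name="resnet50-baseline"):
    mlflow.log_params({"lr": 1e-3, "batch_size": 64, "epochs": 10})
    # ... training / evaluation loop would run here ...
    mlflow.log_metric("val_iou", 0.87, step=10)
```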
Role : Principal DevOps Engineer
About the Client
It is a product-based company building a platform using AI and ML technology for transportation and logistics. They also have a presence in the global market.
Responsibilities and Requirements
• Experience in designing and maintaining high volume and scalable micro-services architecture on cloud infrastructure
• Knowledge in Linux/Unix Administration and Python/Shell Scripting
• Experience working with cloud platforms like AWS (EC2, ELB, S3, Auto-scaling, VPC, Lambda), GCP, Azure
• Knowledge in deployment automation, Continuous Integration and Continuous Deployment (Jenkins, Maven, Puppet, Chef, GitLab) and monitoring tools like Zabbix, Cloud Watch Monitoring, Nagios
• Knowledge of Java Virtual Machines, Apache Tomcat, Nginx, Apache Kafka, Microservices architecture, Caching mechanisms
• Experience in enterprise application development, maintenance and operations
• Knowledge of best practices and IT operations in an always-up, always-available service
• Excellent written and oral communication skills, judgment and decision-making skill


Job Description:
We are looking for a talented Full Stack Developer with a strong background in Node.js, React.js, and AWS to contribute to the development and maintenance of our web applications. As a Full Stack Developer, you will work closely with cross-functional teams to design, develop, and deploy scalable and high-performance software solutions.
Responsibilities:
Collaborate with product managers and designers to translate requirements into technical specifications and deliver high-quality software solutions.
Develop and maintain web applications using Node.js and React.js frameworks.
Write clean, efficient, and well-documented code to ensure the reliability and maintainability of the software.
Implement responsive user interfaces, ensuring a seamless user experience across different devices and platforms.
Integrate third-party APIs and services to enhance application functionality.
Design and optimize databases to ensure efficient data storage and retrieval.
Deploy and manage applications on AWS cloud infrastructure, utilizing services such as EC2, S3, Lambda, and API Gateway.
Monitor and troubleshoot application performance, identify and resolve issues proactively.
Conduct code reviews to maintain code quality standards and provide constructive feedback to team members.
Stay up to date with the latest trends and best practices in web development and cloud technologies.
Requirements:
Proven experience as a Full Stack Developer, working with Node.js and React.js in a professional setting.
Strong proficiency in JavaScript and familiarity with modern front-end frameworks and libraries.
Experience with AWS services, such as EC2, S3, Lambda, API Gateway, and CloudFormation.
Knowledge of database systems, both SQL and NoSQL, and the ability to design efficient data models.
Familiarity with version control systems (e.g., Git) and agile development methodologies.
Ability to write clean, efficient, and well-documented code, following best practices and coding standards.
Strong problem-solving skills and the ability to work effectively in a fast-paced environment.
Excellent communication and collaboration skills, with the ability to work well in a team.

Roles & Responsibilities:
- Bachelor’s degree in Computer Science, Information Technology or a related field
- Experience in designing and maintaining high volume and scalable micro-services architecture on cloud infrastructure
- Knowledge in Linux/Unix Administration and Python/Shell Scripting
- Experience working with cloud platforms like AWS (EC2, ELB, S3, Auto-scaling, VPC, Lambda), GCP, Azure
- Knowledge in deployment automation, Continuous Integration and Continuous Deployment (Jenkins, Maven, Puppet, Chef, GitLab), and monitoring tools like Zabbix, CloudWatch Monitoring, Nagios
- Knowledge of Java Virtual Machines, Apache Tomcat, Nginx, Apache Kafka, Microservices architecture, Caching mechanisms
- Experience in enterprise application development, maintenance and operations
- Knowledge of best practices and IT operations in an always-up, always-available service
- Excellent written and oral communication skills, judgment and decision-making skills
Description
Do you dream about code every night? If so, we’d love to talk to you about a new product that we’re making to enable delightful testing experiences at scale for development teams who build modern software solutions.
What You'll Do
Troubleshooting and analyzing technical issues raised by internal and external users.
Working with Monitoring tools like Prometheus / Nagios / Zabbix.
Experience developing automation in one or more technologies such as Terraform, Ansible, CloudFormation, Puppet, or Chef is preferred.
Monitor infrastructure alerts and take proactive action to avoid downtime and customer impacts.
Working closely with the cross-functional teams to resolve issues.
Test, build, design, deploy, and maintain continuous integration and continuous delivery processes using tools like Jenkins, Maven, Git, etc.
Work in close coordination with the development and operations team such that the application is in line with performance according to the customer's expectations.
What you should have
Bachelor’s or Master’s degree in computer science or any related field.
3 - 6 years of experience in Linux / Unix, cloud computing techniques.
Familiar with working on cloud and datacenter for enterprise customers.
Hands-on experience with Linux / Windows / macOS and Batch/Apple/Bash scripting.
Experience with various databases such as MongoDB, PostgreSQL, MySQL, MSSQL.
Familiar with AWS technologies like EC2, S3, Lambda, IAM, etc.
Must know how to choose the best tools and technologies which best fit the business needs.
Experience in developing and maintaining CI/CD processes using tools like Git, GitHub, Jenkins etc.
Excellent organizational skills to adapt to a constantly changing technical environment
DATA ENGINEER
Overview
They started with a singular belief - what is beautiful cannot and should not be defined in marketing meetings. It's defined by the regular people like us, our sisters, our next-door neighbours, and the friends we make on the playground and in lecture halls. That's why we stand for people-proving everything we do. From the inception of a product idea to testing the final formulations before launch, our consumers are a part of each and every process. They guide and inspire us by sharing their stories with us. They tell us not only about the product they need and the skincare issues they face but also the tales of their struggles, dreams and triumphs. Skincare goes deeper than skin. It's a form of self-care for many. Wherever someone is on this journey, we want to cheer them on through the products we make, the content we create and the conversations we have. What we wish to build is more than a brand. We want to build a community that grows and glows together - cheering each other on, sharing knowledge, and ensuring people always have access to skincare that really works.
Job Description:
We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure and systems that enable efficient data collection, storage, processing, and analysis. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to implement data pipelines and ensure the availability, reliability, and scalability of our data platform.
Responsibilities:
Design and implement scalable and robust data pipelines to collect, process, and store data from various sources.
Develop and maintain data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.
Optimize and tune the performance of data systems to ensure efficient data processing and analysis.
Collaborate with data scientists and analysts to understand data requirements and implement solutions for data modeling and analysis.
Identify and resolve data quality issues, ensuring data accuracy, consistency, and completeness.
Implement and maintain data governance and security measures to protect sensitive data.
Monitor and troubleshoot data infrastructure, perform root cause analysis, and implement necessary fixes.
Stay up-to-date with emerging technologies and industry trends in data engineering and recommend their adoption when appropriate.
Qualifications:
Bachelor’s or higher degree in Computer Science, Information Systems, or a related field.
Proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.
Strong programming skills in languages such as Python, Java, or Scala.
Experience with big data technologies and frameworks like Hadoop, Spark, or Kafka.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).
Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).
Solid understanding of data modeling, data warehousing, and ETL principles.
Knowledge of data integration techniques and tools (e.g., Apache Nifi, Talend, or Informatica).
Strong problem-solving and analytical skills, with the ability to handle complex data challenges.
Excellent communication and collaboration skills to work effectively in a team environment.
Preferred Qualifications:
Advanced knowledge of distributed computing and parallel processing.
Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink).
Familiarity with machine learning concepts and frameworks (e.g., TensorFlow, PyTorch).
Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
Experience with data visualization and reporting tools (e.g., Tableau, Power BI).
Certification in relevant technologies or data engineering disciplines.
Responsibilities
- Implement various development, testing, automation tools, and IT infrastructure
- Design, build and automate the AWS infrastructure (VPC, EC2, Networking, EMR, RDS, S3, ALB, Cloud Front, etc.) using Terraform
- Manage end-to-end production workloads hosted on Docker and AWS
- Automate CI pipeline using Groovy DSL
- Deploy and configure Kubernetes clusters (EKS)
- Design and build a CI/CD Pipeline to deploy applications using Jenkins and Docker
Eligibility
- At least 8 years of proven experience in AWS-based DevOps/cloud engineering and implementations
- Expertise in all common AWS Cloud services like EC2, EKS, S3, VPC, Lambda, API Gateway, ALB, Redis, etc.
- Experience in deploying and managing production environments in Amazon AWS
- Strong experience in continuous integration and continuous deployment
- Knowledge of application build, deployment, and configuration using one of the tools: Jenkins
Qualifications & Experience:
▪ 2 - 4 years overall experience in ETLs, data pipeline, Data Warehouse development and database design
▪ Software solution development using Hadoop Technologies such as MapReduce, Hive, Spark, Kafka, Yarn/Mesos etc.
▪ Expert in SQL, worked on advanced SQL for at least 2+ years
▪ Good development skills in Java, Python or other languages
▪ Experience with EMR, S3
▪ Knowledge and exposure to BI applications, e.g. Tableau, Qlikview
▪ Comfortable working in an agile environment
Job Description
Position: Sr. Data Engineer – Databricks & AWS
Experience: 4 - 5 Years
Company Profile:
Exponentia.ai is an AI tech organization with a presence across India, Singapore, the Middle East, and the UK. We are an innovative and disruptive organization, working on cutting-edge technology to help our clients transform into the enterprises of the future. We provide artificial intelligence-based products/platforms capable of automated cognitive decision-making to improve productivity, quality, and economics of the underlying business processes. Currently, we are transforming ourselves and rapidly expanding our business.
Exponentia.ai has developed long-term relationships with world-class clients such as PayPal, PayU, SBI Group, HDFC Life, Kotak Securities, Wockhardt and Adani Group amongst others.
One of the top partners of Cloudera (leading analytics player) and Qlik (leader in BI technologies), Exponentia.ai was awarded the ‘Innovation Partner Award’ by Qlik in 2017.
Get to know more about us on our website: http://www.exponentia.ai/ and Life @Exponentia.
Role Overview:
· A Data Engineer understands the client requirements and develops and delivers the data engineering solutions as per the scope.
· The role requires good skills in the development of solutions using various services required for data architecture on Databricks Delta Lake, streaming, AWS, ETL Development, and data modeling.
Job Responsibilities
• Design of data solutions on Databricks including delta lake, data warehouse, data marts and other data solutions to support the analytics needs of the organization.
• Apply best practices during design in data modeling (logical, physical) and ETL pipelines (streaming and batch) using cloud-based services.
• Design, develop and manage the pipelining (collection, storage, access), data engineering (data quality, ETL, Data Modelling) and understanding (documentation, exploration) of the data.
• Interact with stakeholders regarding data landscape understanding, conducting discovery exercises, developing proof of concepts and demonstrating it to stakeholders.
Technical Skills
• Has more than 2 years of experience in developing data lakes and data marts on the Databricks platform.
• Proven skill sets in AWS Data Lake services such as - AWS Glue, S3, Lambda, SNS, IAM, and skills in Spark, Python, and SQL.
• Experience in Pentaho
• Good understanding of developing a data warehouse, data marts etc.
• Has a good understanding of system architectures, and design patterns and should be able to design and develop applications using these principles.
Personality Traits
• Good collaboration and communication skills
• Excellent problem-solving skills to be able to structure the right analytical solutions.
• Strong sense of teamwork, ownership, and accountability
• Analytical and conceptual thinking
• Ability to work in a fast-paced environment with tight schedules.
• Good presentation skills with the ability to convey complex ideas to peers and management.
Education:
BE / ME / MS/MCA.


JD / Skills Sets
1. Good knowledge of Python
2. Good knowledge of MySQL, MongoDB
3. Design Patterns
4. OOPs
5. Automation
6. Web scraping
7. Redis queue
8. Basic idea of the Finance domain will be beneficial.
9. Git
10. AWS (EC2, RDS, S3)



Job Title: PHP (Laravel) Developer
Experience: 2 to 7 years
Skills:
PHP: Laravel (MVC)
Database: MySQL
Added advantage:
Database: MongoDB
Server Hosting: AWS (knowledge of EC2, S3, RDS & Route53)
Good to have experience in ReactJS, NodeJS, VueJS
Communication skills: Must have good communication skills
Requirements:
Understanding of MVC design patterns
Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
Knowledge of object-oriented PHP programming
In-depth knowledge of object-oriented PHP 7.x and Laravel 5/6+ PHP Framework
Experience with MVC, Entity Framework, Web Forms, Web API, business layer, and front-end technologies
Creating database schemas that represent and support business processes
Familiarity with SQL/NoSQL databases and their declarative query languages
In-depth knowledge of Git, Bitbucket, and related pipelines for continuous integration and continuous deployment
Creative and efficient problem-solving capability
Understanding of Agile development process
Developing rich and complex web applications in an efficient manner so that the applications let the user interact with the site or application smoothly.
Ability to understand technical documents like SRS, Design Document & Wireframes.
Personal Specifications –
- Proficiency in written and spoken English (must)
- Understanding of project development methodologies like Agile is preferred.
- Understand team development/Source code control


· 4+ years of experience as a Python Developer.
· Good understanding of object-oriented concepts and SOLID principles.
· Good programming and analytical skills.
· Hands-on experience with AWS cloud services such as S3 and Lambda functions. (Must Have)
· Experience working with large datasets. (Must Have)
· Proficient in using NumPy and Pandas (see the sketch after this list). (Must Have)
· Hands-on experience with MySQL. (Must Have)
· Experience debugging Python applications. (Must Have)
· Working knowledge of Flask.
· Knowledge of object-relational mapping (ORM).
· Able to integrate multiple data sources and databases into one system
· Proficient understanding of code versioning tools such as Git, SVN
· Strong problem-solving and logical abilities
· Sound knowledge of Front-end technologies like HTML5, CSS3, and JavaScript
· Strong commitment and desire to learn and grow.
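A small sketch of the S3/Pandas side of such a role; the bucket, key, and column names are invented, and it assumes the s3fs package is installed so Pandas can read s3:// paths directly. It streams a large CSV in chunks so the full dataset never has to fit in memory.

```python
import numpy as np
import pandas as pd

# Hypothetical S3 object holding a large CSV of transactions.
# pandas can read s3:// paths directly when the s3fs package is installed.
PATH = "s3://example-data-bucket/exports/transactions.csv"

# Stream the file in 100k-row chunks instead of loading it all at once,
# aggregating as we go -- a common pattern for datasets that do not fit in memory.
totals = []
for chunk in pd.read_csv(PATH, chunksize=100_000):
    totals.append(chunk.groupby("customer_id")["amount"].sum())

# Combine the partial aggregates into a final per-customer total.
result = pd.concat(totals).groupby(level=0).sum()
print(result.describe())                      # quick sanity check on the distribution
print("overall mean:", np.round(result.mean(), 2))
```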




Summary:
The Learner Company is an education start-up that designs personalized learning experiences by integrating them with the best of what technology offers. We are currently building an online learning engine to host adaptive online courses, simulations, and multiplayer games for institutional partners. We are now in the software development stage of the project.
We are looking for a full-stack developer to join our development team. The developer will be responsible for the overall development and implementation of front and back-end software applications. Their responsibilities will extend from designing system architecture to high-level programming, performance testing, and systems integration.
We are looking for an individual who is optimistic about technology and people, is open to and excited by new ideas, and considers themselves a life-long learner.
Responsibilities:
- Meeting with the software development team to define the scope and scale of software projects.
- Designing software system architecture.
- Completing data structures and design patterns.
- Designing and implementing scalable web services, applications, and APIs.
- Developing and maintaining internal software tools.
- Writing low-level and high-level code.
- Troubleshooting and bug fixing.
- Identifying bottlenecks and improving software efficiency.
- Collaborating with the design team on developing micro-services.
- Writing technical documents.
Required Competencies:
- Bachelor’s degree in computer engineering or computer science.
- Previous experience as a full stack engineer.
- Advanced knowledge of front-end languages including HTML5, CSS, TypeScript, JavaScript, C++, jQuery, React.js and Next.js.
- Knowledge of relational database systems and SQL.
- Familiarity with AWS architecture and working knowledge of services like S3, SES, EC2, RDS and more.
- Proficient in back-end languages including Java, Python, Rails, Ruby, .NET, and PHP.
- Advanced troubleshooting skills.
- Familiarity with MS Word, Excel, PowerPoint, Notion, Veed.io, Linear, Intercom, Plateau, and Miro.
- A strong belief that a team as a whole is greater than the sum of its parts.
- Excellent leadership, communication, and organization skills
Experience Needed: 2+ Years
Location: Bengaluru


Java Developers [I+S/E2-MM2]
Java Full Stack Developer
We are looking for a skilled Full Stack Developer who is passionate about building high-quality software applications. The ideal candidate will have expertise in Java frameworks and extensions, persistence frameworks, servers, platforms, clouds, databases, data storage, and QA tools. The candidate should also have experience working with Angular, React, or Vue.
As a Full Stack Developer, you will be responsible for developing and maintaining software applications for our clients. You will work closely with a team of developers and project managers to deliver high-quality software products. You should be comfortable working in a fast-paced environment and be able to adapt to changing priorities.
Mandatory Skill Sets:
Java frameworks and extensions: You should be proficient in building enterprise-grade applications using Java 8+ and Spring Boot.
Persistence Frameworks: You should have experience working with Hibernate and/or JPA. You should be able to design and develop efficient data models, and perform CRUD operations using Hibernate and/or JPA.
Servers: You should be familiar with Apache Tomcat and be able to deploy applications on Tomcat servers.
Platforms: You should have experience working with the Java EE and Jakarta EE platforms.
Clouds: You should have experience working with AWS, and be familiar with AWS services such as EC2, S3, and RDS.
Databases / Data Storage: You should have experience working with MySQL and Oracle databases.
QA Tools: You should be proficient in JUnit5 and Postman. You should be able to write and execute unit tests, integration tests, and end-to-end tests using these tools.
Web Services: You should have experience working with RESTful web services.
API Security: You should be familiar with OAuth2, JWT, Auth0, or any other API security frameworks.
Angular/React/Vue: You should have experience working with at least one of these frontend frameworks, HTML, CSS, and JavaScript.
If you are passionate about building high-quality software applications and have the required skill sets, we encourage you to apply. We offer competitive salaries and benefits, and a challenging work environment where you can learn and grow.
About Kloud9:
Kloud9 exists with the sole purpose of providing cloud expertise to the retail industry. Our team of cloud architects, engineers and developers help retailers launch a successful cloud initiative so you can quickly realise the benefits of cloud technology. Our standardised, proven cloud adoption methodologies reduce the cloud adoption time and effort so you can directly benefit from lower migration costs.
Kloud9 was founded with the vision of bridging the gap between E-commerce and cloud. The E-commerce of any industry is limiting and poses a huge challenge in terms of the finances spent on physical data structures.
At Kloud9, we know migrating to the cloud is the single most significant technology shift your company faces today. We are your trusted advisors in transformation and are determined to build a deep partnership along the way. Our cloud and retail experts will ease your transition to the cloud.
Our sole focus is to provide cloud expertise to retail industry giving our clients the empowerment that will take their business to the next level. Our team of proficient architects, engineers and developers have been designing, building and implementing solutions for retailers for an average of more than 20 years.
We are a cloud vendor that is both platform and technology independent. Our vendor independence not just provides us with a unique perspective into the cloud market but also ensures that we deliver the cloud solutions available that best meet our clients' requirements.
What we are looking for:
● 3+ years’ experience developing Data & Analytic solutions
● Experience building data lake solutions leveraging one or more of the following: AWS, EMR, S3, Hive & Spark (see the sketch after this list)
● Experience with relational SQL
● Experience with scripting languages such as Shell, Python
● Experience with source control tools such as GitHub and related dev process
● Experience with workflow scheduling tools such as Airflow
● In-depth knowledge of scalable cloud
● Has a passion for data solutions
● Strong understanding of data structures and algorithms
● Strong understanding of solution and technical design
● Has a strong problem-solving and analytical mindset
● Experience working with Agile Teams.
● Able to influence and communicate effectively, both verbally and written, with team members and business stakeholders
● Able to quickly pick up new programming languages, technologies, and frameworks
● Bachelor’s Degree in computer science
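For illustration only, a typical data lake task of this kind might look like the PySpark snippet below: read Parquet from S3, run relational SQL over it, and write a curated, partitioned result back. The bucket paths, view name, and columns are all invented.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

# Hypothetical raw zone: Parquet files landed in S3 (e.g. by an ingestion job on EMR).
orders = spark.read.parquet("s3://example-lake/raw/orders/")
orders.createOrReplaceTempView("orders")

# Relational SQL over the lake: daily revenue per store.
daily_revenue = spark.sql("""
    SELECT store_id,
           to_date(order_ts) AS order_date,
           SUM(amount)       AS revenue
    FROM orders
    GROUP BY store_id, to_date(order_ts)
""")

# Curated zone: partitioned Parquet that downstream Hive/BI tools can query.
(
    daily_revenue.write.mode("overwrite")
                 .partitionBy("order_date")
                 .parquet("s3://example-lake/curated/daily_revenue/")
)
```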
Why Explore a Career at Kloud9:
With job opportunities in prime locations of the US, London, Poland and Bengaluru, we help build your career path in the cutting-edge technologies of AI, Machine Learning and Data Science. Be part of an inclusive and diverse workforce that's changing the face of retail technology with their creativity and innovative solutions. Our vested interest in our employees translates into delivering the best products and solutions to our customers.


Responsibilities
- Develop new user-facing features using React.js and RESTful APIs using Node.js and MongoDB
- Build reusable code and libraries for future use
- Optimize applications for maximum speed and scalability
- Collaborate with team members, e.g. designers, product managers and other stakeholders, to ensure quality in the product.
- Ensure the technical feasibility of UI/UX designs
- Manage and maintain cloud infrastructure on AWS
Qualifications
- At least 3-6 years of experience in MERN stack
- Proficiency with React.js, Node.js, MongoDB, and Express.js
- Familiarity with AWS services such as EC2, S3, and RDS, SQS, Lambda
- Understanding of RESTful API design principles
- Understanding of Agile software development methodologies
- Strong problem-solving and analytical skills
Mactores is a trusted leader among businesses in providing modern data platform solutions. Since 2008, Mactores has been enabling businesses to accelerate their value through automation by providing end-to-end data solutions that are automated, agile, and secure. We collaborate with customers to strategize, navigate, and accelerate an ideal path forward with a digital transformation via assessments, migration, or modernization.
We are looking for a DataOps Engineer with expertise in operating a data lake. The data lake is built on Amazon S3 and Amazon EMR, with Apache Airflow for workflow management.
You have experience building and running data lake platforms on AWS, exposure to operating PySpark-based ETL jobs in Apache Airflow and Amazon EMR, and expertise in monitoring services like Amazon CloudWatch.
If you love solving problems, have a professional services background, and want an unusual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself, this role is for you.
What you will do?
- Operate the current data lake deployed on AWS with Amazon S3, Amazon EMR, and Apache Airflow (see the sketch after this list)
- Debug and fix production issues in PySpark.
- Determine the RCA (Root cause analysis) for production issues.
- Collaborate with product teams for L3/L4 production issues in PySpark.
- Contribute to enhancing ETL efficiency
- Build CloudWatch dashboards to optimize operational efficiency
- Handle escalation tickets from L1 Monitoring engineers
- Assign the tickets to L1 engineers based on their expertise
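For orientation, a skeletal Airflow DAG of the kind implied above might look like the sketch below. It assumes a recent Airflow with the Amazon provider package installed; the cluster ID, bucket, and script path are placeholders. The DAG submits a PySpark script stored in S3 as a step on an existing EMR cluster and then waits for the step to finish.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.emr import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor

SPARK_STEP = [{
    "Name": "nightly_etl",
    "ActionOnFailure": "CONTINUE",
    "HadoopJarStep": {
        "Jar": "command-runner.jar",
        "Args": [
            "spark-submit",
            "--deploy-mode", "cluster",
            "s3://example-lake/jobs/nightly_etl.py",   # hypothetical PySpark job
        ],
    },
}]

with DAG(
    dag_id="nightly_etl",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",          # run at 02:00 every day
    catchup=False,
) as dag:
    add_step = EmrAddStepsOperator(
        task_id="submit_spark_step",
        job_flow_id="j-EXAMPLECLUSTER",     # placeholder EMR cluster ID
        steps=SPARK_STEP,
    )

    # Block until the submitted step succeeds or fails, so failures alert on this task.
    wait_for_step = EmrStepSensor(
        task_id="wait_for_spark_step",
        job_flow_id="j-EXAMPLECLUSTER",
        step_id="{{ task_instance.xcom_pull(task_ids='submit_spark_step')[0] }}",
    )

    add_step >> wait_for_step
```

Failures then surface on the sensor task, which is where CloudWatch-backed alerting or an L1 escalation would typically hook in.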
What are we looking for?
- AWS DataOps engineer.
- Overall 5+ years of experience in the software industry, with experience in architecting and developing data applications using Python or Scala, Airflow, and Kafka on an AWS data platform.
- Must have set up or led the project to enable Data Ops on AWS or any other cloud data platform.
- Strong data engineering experience on Cloud platform, preferably AWS.
- Experience with data pipelines designed for reuse and parameterization.
- Experience with pipelines designed to solve common ETL problems.
- Understanding of or experience with how various AWS services, such as Amazon EMR and Apache Airflow, can be codified to enable DataOps.
- Experience in building data pipelines using CI/CD infrastructure.
- Understanding of Infrastructure as Code for DataOps enablement.
- Ability to work with ambiguity and create quick PoCs.
You will be preferred if
- Expertise in Amazon EMR, Apache Airflow, Terraform, CloudWatch
- Exposure to MLOps using Amazon SageMaker is a plus.
- AWS Solutions Architect Professional or Associate Level Certificate
- AWS DevOps Professional Certificate
Life at Mactores
We care about creating a culture that makes a real difference in the lives of every Mactorian. Our 10 Core Leadership Principles that honor Decision-making, Leadership, Collaboration, and Curiosity drive how we work.
1. Be one step ahead
2. Deliver the best
3. Be bold
4. Pay attention to the detail
5. Enjoy the challenge
6. Be curious and take action
7. Take leadership
8. Own it
9. Deliver value
10. Be collaborative
We would like you to read more details about the work culture on https://mactores.com/careers
The Path to Joining the Mactores Team
At Mactores, our recruitment process is structured around three distinct stages:
Pre-Employment Assessment: You will be invited to participate in a series of pre-employment evaluations to assess your technical proficiency and suitability for the role.
Managerial Interview: The hiring manager will engage with you in multiple discussions, lasting anywhere from 30 minutes to an hour, to assess your technical skills, hands-on experience, leadership potential, and communication abilities.
HR Discussion: During this 30-minute session, you'll have the opportunity to discuss the offer and next steps with a member of the HR team.
At Mactores, we are committed to providing equal opportunities in all of our employment practices, and we do not discriminate based on race, religion, gender, national origin, age, disability, marital status, military status, genetic information, or any other category protected by federal, state, and local laws. This policy extends to all aspects of the employment relationship, including recruitment, compensation, promotions, transfers, disciplinary action, layoff, training, and social and recreational programs. All employment decisions will be made in compliance with these principles.
knowledge of EC2, RDS and S3.
● Good command of the Linux environment
● Experience with tools such as Docker, Kubernetes, Redis, NodeJS and Nginx server configurations and deployment, Kafka, Elasticsearch, Ansible, Terraform, etc.
● Bonus: AWS certification is a plus
● Bonus: Basic understanding of database queries for relational databases such as MySQL
● Bonus: Experience with CI servers such as Jenkins, Travis or similar
● Bonus: Demonstrated programming capability in a high-level programming language such as Python, Go, or similar
● Develop, maintain and administer tools which will automate operational activities and improve engineering productivity
● Automate continuous delivery and on-demand capacity management solutions
● Developing configuration and infrastructure solutions for internal deployments
● Troubleshooting, diagnosing and fixing software issues
● Updating, tracking and resolving technical issues
● Suggesting architecture improvements, recommending process improvements
● Evaluate new technology options and vendor products. Ensure critical system security through the use of best-in-class security solutions
● Technical experience in a similar role supporting large-scale production distributed systems
● Must understand the overall system architecture, improve design and implement new processes.


About the role
Checking quality is one of the most important tasks at Anakin. Our clients price their products based on our data, and minor errors on our end can lead to client losses of millions of dollars. You would work with multiple tools and with people across various departments to ensure the accuracy of the data being crawled. You would set up manual and automated processes and make sure they run to ensure the highest possible data quality.
You are the engineer other engineers can count on. You embrace every problem with enthusiasm. You remove hurdles, are a self-starter and a team player. You have the hunger to venture into unknown areas and make the system work.
Your Responsibilities would be to:
- Understand customer web scraping and data requirements; translate these into test approaches that include exploratory manual/visual testing and any additional automated tests deemed appropriate
- Take ownership of the end-to-end QA process in newly-started projects
- Draw conclusions about data quality by producing basic descriptive statistics, summaries, and visualisations (see the sketch after this list)
- Proactively suggest and take ownership of improvements to QA processes and methodologies by employing other technologies and tools, including but not limited to: browser add-ons, Excel add-ons, UI-based test automation tools etc.
- Ensure that project requirements are testable; work with project managers and/or clients to clarify ambiguities before QA begins
- Drive innovation and advanced validation and analytics techniques to ensure data quality for Anakin's customers
- Optimize data quality codebases and systems to monitor the Anakin family of app crawlers
- Configure and optimize the automated and manual testing and deployment systems used to check the quality of billions of data points from 1,000+ crawlers across the company
- Analyze data and bugs that require in-depth investigations
- Interface directly with external customers including managing relationships and steering requirements
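As a small illustration of the descriptive-statistics part of this work, the sketch below summarises a batch of scraped records; the column names and the zero-price heuristic are invented for the example.

```python
import pandas as pd

def basic_quality_report(df: pd.DataFrame) -> dict:
    """Summarise a batch of scraped price records for a quick QA review."""
    return {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "null_rate_per_column": df.isna().mean().round(3).to_dict(),
        "price_summary": df["price"].describe().to_dict(),   # hypothetical column
        # Prices of zero or below usually indicate a broken parser.
        "suspicious_prices": int((df["price"] <= 0).sum()),
    }

if __name__ == "__main__":
    batch = pd.DataFrame({
        "sku":   ["A1", "A2", "A2", "A3"],
        "price": [199.0, 0.0, 0.0, None],
    })
    print(basic_quality_report(batch))
```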
Basic Qualifications:
- 2+ years of experience as a backend or a full-stack software engineer
- Web scraping experience with Python or Node.js
- 2+ years of experience with AWS services such as EC2, S3, Lambda, etc.
- Should have managed a team of software engineers
- Should be paranoid about data quality
Preferred Skills and Experience:
- Deep experience with network debugging across all OSI layers (Wireshark)
- Knowledge of networks or/and cybersecurity
- Broad understanding of the landscape of software engineering design patterns and principles
- Ability to work quickly and accurately in a highly stressful environment, removing runtime bugs within minutes
- Excellent communicator, both written and verbal
Additional Requirements:
- Must be available to work extended hours and weekends when needed to meet critical deadlines
- Must have an aversion to politics and BS. Should let his/her work speak for him/her.
- Must be comfortable with uncertainty. In almost all the cases, your job will be to figure it out.
- Must not be bounded to comfort zone. Often, you will need to challenge yourself to go above and beyond.


We are looking for a FULL-TIME OFFICE POSITION Java Developer (2-3 years experience) who is proficient with coding and can design, develop, test, and implement Java applications and resolve technical issues.
As a Java Developer, we are looking for a highly skilled candidate, who will be responsible for building great web applications in Java, analyzing business objectives and user requirements, suggesting necessary changes for the existing Java applications, compiling detailed technical documentation, and determining application functionalities and features.
Do you think you fit this description well? Then apply now!
Expected Responsibilities
- Understanding the business requirements
- Designing and developing the front end for customer-facing applications using the MVC framework
- Creating RESTful APIs for the front-end developers, using an ORM or raw queries, for database technologies like MySQL or MongoDB
- Creating self-contained, reusable, and testable modules and components
- Ensuring a clear dependency chain regarding the app logic as well as the file system
- Supporting continuous improvement by investigating alternatives and technologies and presenting these for architectural review
- Writing non-blocking code and resorting to advanced techniques, such as multi-threading, when required
Required Skills And Qualifications
- MCA/BTech degree in Computer Science, Engineering, or a related subject
- Thorough understanding of the responsibilities of the platform, database, API, caching layer, proxies, and other web services used in the system
- Experience with Docker or any other containerization tools
- Growth mindset and a positive and collaborative attitude
- Professional, precise communication skills
- Fluency in English, both written and spoken
Must-Haves
- At least 2-3 years of experience as a Java Developer
- Experience with Amazon Web Services (S3, Lambda, Elastic Beanstalk, and other AWS modules)
- Hands-on experience in Core Java, Spring Boot, and Spring framework (Embedded Tomcat), REST API skills
- Deep understanding of Java, servlets in J2EE, web-based request handling with Microservices architecture
- Experience with JavaScript-based front-end frameworks like Angular JS or React JS
- Good understanding of Design Patterns, Data structures, and Algorithms
- Exposure to Agile principles and methodologies, including Continuous Integration and Test Driven Development
- Comprehensive knowledge of OO design principles and development patterns
- Java Batch Scheduling (e.g., Flux, Quartz) knowledge
- Previous experience working with clean code, SOLID principles, TDD
A Little About Who We Are
Klizo Solutions, founded by Joseph Ricard, is an IT company that develops outstanding applications and techs in an enterprise environment, located near CC2 in the New Town Area, Kolkata.
But it isn’t the first brainchild of our founder, as he also has multiple start-ups to his credit, including one of the largest music apps in the Philippines and Italy and a first-of-its-kind Cannabis Vending Machine.
So, naturally, being a part of a company founded and run by such a visionary tech leader and a serial entrepreneur, our employees always have ample opportunities to learn and grow as the company evolves.
Currently, our big happy Klizo family consists of 50+ employees. But we want to extend the happiness of working and collaborating with talented individuals even further. And by this, we mean we will hire more talents over the next few weeks!
Perks Of Being A Klizonian
- Training will be provided (if required)
- 5-day work week
- On-time salary every month
- Cool and approachable management
- Numerous opportunities for growth
Job Type: Full Time


Interfaces with other processes and/or business functions to ensure they can leverage the benefits provided by the AWS Platform process
Responsible for managing the configuration of all IaaS assets across the platforms
Hands-on Python experience
Manages the entire AWS platform (Python, Flask, REST API, serverless framework) and recommends the services that best meet the organization's requirements
Has a good understanding of the various AWS services, particularly: S3, Athena, Python code, Glue, Lambda, CloudFormation, and other AWS serverless resources
AWS certification is a plus
Knowledge of best practices for IT operations in an always-on, always-available service model
Responsible for the execution of the process controls, ensuring that staff comply with process and data standards
Qualifications
Bachelor's degree in Computer Science, Business Information Systems or relevant experience and accomplishments
3 to 6 years of experience in the IT field
AWS Python developer
AWS, Serverless/Lambda, Middleware.
Strong AWS skills including Data Pipeline, S3, RDS and Redshift, with familiarity with other components like Lambda, Glue, Step Functions and CloudWatch
Must have created REST APIs with AWS Lambda (see the sketch after this list).
Relevant Python experience: 3 years
Good to have: experience working on projects and problem-solving with large-scale multi-vendor teams
Good to have: knowledge of Agile development
Good knowledge of the SDLC.
Hands-on with AWS databases (RDS, etc.)
Good to have: unit testing experience.
Good to have: working knowledge of CI/CD.
Good communication skills, as there will be client interaction and documentation.
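A rough sketch of the "REST API with AWS Lambda" requirement is shown below. The table name is a placeholder, the event shape assumes an API Gateway proxy integration, and DynamoDB stands in for whatever data store the real service would use.

```python
import json

import boto3

# Hypothetical DynamoDB table backing the API; swap for RDS, S3, etc. as needed.
TABLE_NAME = "example-items"
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)

def lambda_handler(event, context):
    """Minimal GET /items/{item_id} handler behind API Gateway (proxy integration)."""
    item_id = (event.get("pathParameters") or {}).get("item_id")
    if not item_id:
        return _response(400, {"error": "item_id is required"})

    result = table.get_item(Key={"item_id": item_id})
    item = result.get("Item")
    if item is None:
        return _response(404, {"error": "not found"})
    return _response(200, item)

def _response(status_code, body):
    # API Gateway expects this shape from a proxy-integrated Lambda.
    return {
        "statusCode": status_code,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body, default=str),
    }
```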
Education (degree): Bachelor's degree in Computer Science, Business Information Systems or relevant experience and accomplishments
Years of Experience: 3-6 years
Technical Skills
Linux/Unix system administration
Continuous Integration/Continuous Delivery tools like Jenkins
Cloud provisioning and management – Azure, AWS, GCP
Ansible, Chef, or Puppet
Python, PowerShell & BASH
Job Details
JOB TITLE/JOB CODE: AWS Python Developer, III-Sr. Analyst
RC: TBD
PREFERRED LOCATION: HYDERABAD, IND
POSITION REPORTS TO: Manager USI T&I Cloud Managed Platform
CAREER LEVEL: 3
Work Location:
Hyderabad


About
Blend-ed, the brainchild of a team of educationalists and technologists who believe in science-backed learning, is an innovative and technology-rich platform that steps early into the futuristic blended learning system, envisioning a fundamental change in the prevailing conventional teaching and learning cultures. Working with us will be your contribution towards rewriting the history of education and building the future of learning.
We are looking for a Senior Full Stack Developer to build scalable software solutions in the Ed-tech SaaS domain. In this role, you'll have sole responsibility for the full software development life cycle, from conception to deployment. As a Full Stack Developer, you should be comfortable with both the front-end and back-end stacks, development frameworks and third-party libraries.
Requirement : Technical Lead - Senior Full Stack Engineer.
Location : Bangalore, India.
Work Model : Hybrid
Experience : Preferably 3+ years in a similar role.
Joining Date : Immediate
Salary : Rs 12,00,000 LPA - Rs 20,00,000 LPA (negotiable)
Role Expectations & Responsibilities
- Work with product manager/s to ideate software solutions.
- Lead the project from conception through implementation.
- Design client-side and server-side architecture.
- Build the front-end of applications through appealing visual design.
- Develop and manage well-functioning databases and applications.
- Write effective APIs.
- Test, Troubleshoot, debug and upgrade software.
- Create security and data protection settings.
- Write technical documentation.
- Strong organizational and mentoring skills.
Required Experience & Skills
- Proven experience as a Full Stack Developer or similar role.
- Experience developing web and mobile applications.
- Knowledge of multiple front-end languages and libraries (e.g. HTML/ CSS, JavaScript, React etc.)
- Knowledge of back-end languages (e.g. Python/ NodeJS etc.).
- Familiarity with databases (e.g. PostgreSQL, MySQL, MongoDB), web servers (e.g. Apache, Nginx)
Desirable Experience & Skills
- Experience with Python web frameworks, specifically Django.
- Experience with React is a plus.
- AWS skills, with knowledge of EC2, VPC, S3, etc. (see the sketch after this list).
- Experience in containerization (Docker) and container orchestration (Kubernetes).
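As one concrete example of the AWS/S3 skills above (the bucket name and object key are hypothetical), a Python/Django backend often hands the browser a presigned S3 URL so large file uploads bypass the application server entirely:

```python
import boto3
from botocore.config import Config

# Hypothetical bucket for learner-uploaded assignments.
BUCKET = "example-course-uploads"

s3 = boto3.client("s3", config=Config(signature_version="s3v4"))

def presigned_upload_url(key: str, content_type: str, expires_in: int = 900) -> str:
    """Return a URL the front end can PUT a file to directly, valid for 15 minutes."""
    return s3.generate_presigned_url(
        "put_object",
        Params={"Bucket": BUCKET, "Key": key, "ContentType": content_type},
        ExpiresIn=expires_in,
    )

if __name__ == "__main__":
    # The React client would then do: fetch(url, {method: "PUT", body: file, headers: {...}})
    print(presigned_upload_url("assignments/user-42/essay.pdf", "application/pdf"))
```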
Why Join Us:
At Blend-ed, we are committed to creating a diverse work environment with a range of opportunities for the professional and personal development of all our employees. We make learning easier, more efficient, more user-friendly and far more accessible, for all sorts of people, all across the world. As a member of team Blend-ed, you will work on the most innovative projects that will revolutionise the learning ecosystem and make a real difference in millions of people's learning experiences.
Work Culture: With a task-driven office culture, we believe that optimal productivity is rooted in each individual's pace and strength, rather than rigid timelines. We follow a flexible work culture along with mutual understanding and support.
Growth at a personal level: Each member of our team gets the best nourishment in terms of opportunities, mental and moral support and a stage for their existing and enhancing skills, ensuring your personal growth alongside the company.
Creativity and Innovation: We root for creativity and innovation at work and this is a fixed formula to keep things interesting. You are welcome to lay your craziest ideas on the table or board for us to have conversations and coffee over.
Work Environment: Situated in Calicut, a bustling commercial centre in Kerala that has emerged as one of the best cities to live in India in terms of education, health, entertainment, and the environment, our workspace in HiLite Business Park, flanked by the mall and residential areas, offers an environment of networking, convenience and class, which promotes an undeniably better quality of life and experience.
Our Values For All Times:
Innovate, Always: We are committed to creativity and novelty because anything remarkable ever was born out of those.
Science-driven: We base the technological developments on theories and findings in cognitive science, making learning a richer experience.
Love it? Learn it! At Blend-Ed, learning is limited only by the amount of inquisitiveness and passion for exploring. We believe in unlimited exploration and take pride in seeking knowledge at any cost.
Taller, Together: Growth is a team effort at Blend-Ed. We work towards growing together, packing the personal growth of each team member into the company's growth, with mutual respect, motivation and inspiration ingrained into our conduct.
Better Tomorrows: Along with our dreams and goals of transforming the education system, we envision a better tomorrow for the world. The very idea of a better future is rooted in hope and we nurture it.
Ethics Uncompromised: Our work culture is founded in a code of ethics that preserves smooth relationships within the company and outside it, in order to meet our goals and other values.


Route to Smile is a patient-centric, tech-oriented brand in the Healthcare space.
With a team of experts and a mission to continuously innovate, Route to Smile is revolutionizing the Healthcare industry by providing customized solutions to facilitate accurate and precise clinical outcomes and patient-specific care with a focus on technological service delivery.
As a part of their latest offerings, Route to Smile is leveraging leading technologies including IoT, Industry 4.0 and AI/ML to further enable healthcare professionals to improve patient engagement and drive overall practice growth.
Job Description:
We are looking for an enthusiastic and motivated individual who is willing to work with Java technologies, with experience in front-end languages and libraries (e.g. Angular, HTML/CSS, JavaScript, XML, jQuery, Vue.js, React) and the related back-end languages (e.g. Java) and frameworks (e.g. Spring Boot, Node.js, Spring MVC, Hibernate, Struts, etc.), and who is also willing to work in a challenging environment.
The candidate will be an integral member of a team that will employ the latest techniques and best practices in software development and utilize best-in-class tools and frameworks to build a first-of-their-kind cloud-agnostic, enterprise-class large-scale SaaS application.
Key Responsibilities:
- Responsible for building the architecture, key components, and middleware of the platform and developing fully multi-tenant systems
- Develop workflow management functions
- Develop REST APIs, as well as contribute to the overall API framework
- Implement solutions using iterative processes, Agile development methodologies and test-driven development
- Being a senior developer, you will also be required to lead a team of junior developers.
Technical Skills:
- 5 to 9 years of development experience with Java and related front-end and back-end technologies, and experience working in a distributed systems environment.
- Extensive experience in developing full stack end-to-end scalable and distributed applications.
- Experience in SaaS Application architecture & design and working with database modeling and design concepts. Understanding of database design and maintenance and experience with SQLite, PostgreSQL, and MySQL.
- Experience with developing loosely coupled design, Micro-services development, message queues, and customized application deployment using RESTful services, implementing REST APIs with Spring or JAX-RS.
- Working knowledge of either Spring boot, Spring MVC, Hibernate, Struts Framework, building REST based Web Services using Java EE Standards
- Have experience in single sign-on, multi-factor authentication and security background.
- Knowledge of Business Process Model and Notation (BPMN 2.0) workflow engines such as Activiti, jBPM, Orchestra, and Flowable is desirable, though not mandatory.
Good to have Skills:
- Familiarity with databases (e.g. MySQL, MongoDB), web servers (e.g. Apache) and UI/UX design.
- Good understanding of HTML5, CSS3, JavaScript.
- Good familiarity with Linux operating system, managing Linux servers and using AWS (EC2, S3).
- Familiarity with administering, automating, and deploying to cloud-based environments such as AWS, Google Cloud Platform, or Azure, or working experience with Kubernetes.
- Should have used GIT.
- Understanding of search technologies such as Solr, ElasticSearch, and Lucene
- Have working knowledge in CI/CD pipelines (Concourse/Jenkins), and at times wear multiple hats to double up as Dev Ops
- Experience being part of product teams and in handling integrations, good at communication and teamwork skills.
- Strong emphasis on quality and the ability to deliver quickly and consistently.
- Self-driven and motivated to work with cutting edge technology.

Hello,
Greetings from CodersBrain!
Coders Brain is a global leader in its services, digital, and business solutions that partners with its clients to simplify, strengthen, and transform their businesses. We ensure the highest levels of certainty and satisfaction through a deep-set commitment to our clients, comprehensive industry expertise, and a global network of innovation and delivery centers.
This is regarding the urgent opening for the ROR Developer role. We found your profile in the Cutshort database and it seems like a good fit for the organization. If you are interested, do revert with your updated CV along with the details:
Permanent Payroll : CodersBrain
Location:- Mumbai/ Kolkata
Notice Period:- Imm./15 days
Job Description
Scope of work: This role is an exciting opportunity to build highly interactive customer-facing applications and products.
The candidate will help transform vast collections of data into actionable insights with intuitive and easy-to-use interfaces and visualizations.
The candidate will leverage the power of JavaScript and Ruby on Rails to build novel features and improvements to our current suite of tools.
This team is responsible for customer-facing applications that deliver SEO data insights.
Through the applications, customers are able to access insights, workflows, and aggregations of information above and beyond core data offerings.
Responsibility
Build and maintain the core frontend application
Work collaboratively with the engineers on the Frontend team to ensure quality and performance of the systems through code reviews, documentation, analysis and employing engineering best practices to ensure high-quality software
Contribute to org devops culture by maintaining our systems, including creating documentation, run books, monitoring, alerting, and integration tests, etc.
Participate in architecture design and development for new features and capabilities, and for migration of legacy systems, to meet business and customer needs.
Take turns in the on-call rotation, handling systems and operations issues as they arise including responding to off-hours alerts
Collaborate with other teams on dependent work and integrations as well as be vigilant for activities happening outside the team that would have an impact on the work your team is doing.
Work with Product Managers and UX Designers to deliver new features and capabilities
Use good security practices to protect code and systems
Pitch in where needed during major efforts or when critical issues arise
Give constructive, critical feedback to other team members through pull requests, design reviews, and other methods
Seek out opportunities and work to grow skills and expertise.
C) Skills Required
Essential Skills: JavaScript (preferred: ExtJS framework)
Ruby
Ruby on Rails
MySQL
Docker
Desired Skills: Terraform
Basic Unix/Linux administration
Redis
Resque
TravisCI
AWS ECS
AWS RDS
AWS EMR
AWS S3
AWS Step Functions
AWS Lambda Functions
Experience working remotely with a distributed team
Great problem-solving skills
D) Other Information
Educational Qualifications: Bachelor's degree/MCA
Experience: 5–8 years
Please confirm the mail with your updated CV if you are interested in this position, and also please share the below-mentioned details:
Current CTC:
Expected CTC:
Current Company:
Notice Period:
Are you okay with a 1-week notice period? If not, are you comfortable doing freelance work with us until joining:
Current Location:
Preferred Location:
Total-experience:
Relevant experience:
Highest qualification:
DOJ (if offer in hand from other company):
Offer in hand:
Alternate number:
Interview Availability


What is the role?
Xoxoday is looking for a candidate who has a strong background in the design and implementation of scalable architecture and a good understanding of algorithms, data structures, and design patterns. Candidates must be ready to learn new tools, languages, and technologies.
Basic Qualifications:
- At least 4 -7 years of experience as a software developer.
- At least 3 years of experience in .NET Core C#, the AWS stack, MS SQL Server, and MVC; Node.js experience is a plus
- Strong working knowledge of distributed event-driven messaging architectures/platforms
- Strong knowledge of the data access layer, especially the ability to work with stored procedures
- Established and stimulated software development standards and processes along with best practices for delivery of scalable and high-quality software.
- Production experience with AWS stack
- Fluent English speaker
Preferred Qualifications:
- Experience working with OOP languages.
- Experience designing and developing Microservices and SOA.
- Experience working with AWS Kinesis, Lambda, SQS, S3, ElastiCache, ElasticSearch, Kubernetes, EventBridge, RDS, CloudWatch, APIGateway
- Experience designing and building high-performance scalable web services.
- Experience in REST API design and implementation.
- Experience in unit testing, test automation, and continuous delivery.
- Experience with stream-processing and message-broker software.
Nice to have:
- Experience working with distributed teams.
- Ability to work independently and as part of a team.
- Ability to work quickly toward tight deadlines, and make smart tradeoffs between speed, accuracy, and maintainability.
- Bachelor's or Master's degree in computer science (or equivalent professional experience).
What can you look for?
A wholesome opportunity in a fast-paced environment that will enable you to juggle between concepts, yet maintain the quality of content, interact, and share your ideas and have loads of learning while at work. Work with a team of highly talented young professionals and enjoy the benefits of being at Xoxoday.
We are
Xoxoday is a rapidly growing fintech SaaS firm that propels business growth while focusing on human motivation. Backed by Giift and Apis Partners Growth Fund II, Xoxoday offers a suite of three products - Plum, Empuls, and Compass. Xoxoday works with more than 2000 clients across 10+ countries and over 2.5 million users. Headquartered in Bengaluru, Xoxoday is a 300+ strong team with four global offices in San Francisco, Dublin, Singapore, New Delhi.
Way forward
We look forward to connecting with you. As you may take time to review this opportunity, we will wait for a reasonable time of around 3-5 days before we screen the collected applications and start lining up job discussions with the hiring manager. We however assure you that we will attempt to maintain a reasonable time window for successfully closing this requirement. The candidates will be kept informed and updated on the feedback and application status.