50+ AWS (Amazon Web Services) Jobs in Bangalore (Bengaluru) | AWS (Amazon Web Services) Job openings in Bangalore (Bengaluru)
Job Description for Data Engineer Role:
Must have:
Experience working with programming languages; solid foundational and conceptual knowledge is expected.
Experience working with Databases and SQL optimizations
Experience as a team lead or tech lead, able to independently drive technical decisions and execution and to motivate the team in ambiguous problem spaces.
Problem-solving, judgement, and strategic decision-making skills to drive the team forward.
Role and Responsibilities:
- Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, mentoring other members of the engineering community, and, from time to time, writing or evaluating code
- Collaborate with digital product managers and leaders from other teams to refine the strategic needs of the project
- Utilize programming languages like Java, Python, SQL, Node.js, Go, and Scala, along with open-source RDBMS and NoSQL databases
- Define best practices for data validation, automate them wherever possible, and align with enterprise standards
Qualifications -
- Experience with SQL and NoSQL databases.
- Experience with cloud platforms, preferably AWS.
- Strong experience with data warehousing and data lake technologies (Snowflake)
- Expertise in data modelling
- Experience with ETL/ELT tools and methodologies
- Experience working with real-time data streaming and streaming platforms (see the sketch after this list)
- 2+ years of experience in at least one of the following: Java, Scala, Python, Go, or Node.js
- 2+ years working with SQL and NoSQL databases, data modeling and data management
- 2+ years of experience with AWS, GCP, Azure, or another cloud service.
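To make the streaming requirement above a little more concrete, here is a minimal, purely illustrative Python sketch of consuming a real-time stream. It assumes the kafka-python client, and the topic name and broker address are placeholders, not details from this posting.

```python
# Minimal Kafka consumer sketch (assumes kafka-python; topic and broker are placeholders).
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                             # hypothetical topic name
    bootstrap_servers="localhost:9092",   # placeholder broker address
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Downstream processing would go here, e.g. validation or loading into a warehouse.
    print(event.get("order_id"), event.get("amount"))
```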
Job Overview:
We are seeking a skilled Senior Python Full Stack Developer with a strong background in application architecture and design. The ideal candidate will be proficient in Python, with extensive experience in web frameworks such as Django or Flask, along with front-end technologies such as React and JavaScript. You'll play a key role in designing scalable applications, collaborating with cross-functional teams, and leveraging cloud technologies.
Key Responsibilities:
- Backend Development:
  - Architect, develop, and maintain high-performance backend systems using Python or Golang.
  - Build and optimize APIs and microservices that power innovative, user-focused features (see the sketch after this list).
  - Implement security and data protection measures that are scalable from day one.
  - Collaborate closely with DevOps to deploy and manage applications seamlessly in dynamic cloud environments.
- Frontend Development:
  - Work hand-in-hand with front-end developers to integrate and harmonize backend systems with React-based applications.
  - Contribute to the UI/UX design process, ensuring an intuitive, frictionless user experience that aligns with the startup’s vision.
  - Continuously optimize web applications to ensure they are fast, responsive, and scalable as the user base grows.
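As a purely illustrative sketch of the API/microservice work mentioned in the backend list, here is a minimal Flask service with two JSON endpoints. The routes and payloads are hypothetical, and only the Flask package is assumed.

```python
# Minimal Flask microservice sketch (routes and payloads are hypothetical).
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health", methods=["GET"])
def health():
    # Lightweight health check endpoint, typically probed by load balancers.
    return jsonify({"status": "ok"})

@app.route("/users/<int:user_id>", methods=["GET"])
def get_user(user_id: int):
    # A real service would query a database here; this returns a stub record.
    return jsonify({"id": user_id, "name": "placeholder"})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```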
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in backend development, with proficiency in Python and/or Golang.
- Strong experience in front-end technologies, particularly React.
- Familiarity with cloud platforms (AWS, GCP, or Azure) and containerization tools like Docker and Kubernetes.
- Knowledge of Apache Spark is highly preferred.
- Solid understanding of database technologies, both relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, Cassandra).
- Experience with CI/CD pipelines and automated testing frameworks.
- Excellent problem-solving skills and a proactive attitude toward tackling challenges
Hi,
We are looking for candidates with experience in DevSecOps.
Please find the JD below for your reference.
Responsibilities:
Execute shell scripts for seamless automation and system management.
Implement infrastructure as code using Terraform for AWS, Kubernetes, Helm, kustomize, and kubectl.
Oversee AWS security groups and VPC configurations, and utilize Aviatrix for efficient network orchestration (see the sketch after this list).
Contribute to the OpenTelemetry Collector for enhanced observability.
Implement microsegmentation using AWS native resources and Aviatrix for commercial routes.
Enforce policies through Open Policy Agent (OPA) integration.
Develop and maintain comprehensive runbooks for standard operating procedures.
Utilize packet tracing for network analysis and security optimization.
Apply OWASP tools and practices for robust web application security.
Integrate container vulnerability scanning tools seamlessly within CI/CD pipelines.
Define security requirements for source code repositories, binary repositories, and secrets managers in CI/CD pipelines.
Collaborate with software and platform engineers to infuse security principles into DevOps teams.
Regularly monitor and report project status to the management team.
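As one hedged illustration of the security group oversight above, the sketch below uses boto3 to flag groups that allow SSH from anywhere. The region and the "port 22 open to 0.0.0.0/0" check are assumptions for the example, not the team's actual policy.

```python
# Sketch: flag security groups that allow SSH (port 22) from 0.0.0.0/0.
# Assumes boto3 with configured AWS credentials; the region is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")

for sg in ec2.describe_security_groups()["SecurityGroups"]:
    for rule in sg.get("IpPermissions", []):
        open_to_world = any(
            r.get("CidrIp") == "0.0.0.0/0" for r in rule.get("IpRanges", [])
        )
        if rule.get("FromPort") == 22 and open_to_world:
            print(f"Review needed: {sg['GroupId']} ({sg.get('GroupName')}) allows SSH from anywhere")
```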
Qualifications:
Proficient in shell scripting and automation.
Strong command of Terraform, AWS, Kubernetes, Helm, kustomize, and kubectl.
Deep understanding of AWS security practices, VPC configurations, and Aviatrix.
Familiarity with OpenTelemetry for observability and OPA for policy enforcement.
Experience in packet tracing for network analysis.
Practical application of OWASP tools and web application security.
Integration of container vulnerability scanning tools within CI/CD pipelines.
Proven ability to define security requirements for source code repositories, binary repositories, and secrets managers in CI/CD pipelines.
Collaboration expertise with DevOps teams for security integration.
Regular monitoring and reporting capabilities.
Site Reliability Engineering experience.
Hands-on proficiency with source code management tools, especially Git.
Cloud platform expertise (AWS, Azure, or GCP) with hands-on experience in deploying and managing applications.
Please send across your updated profile.
SeekLMS, a flagship SaaS platform of CloodOn, is hiring for a Software Engineer who will be responsible for the design and development of its key components.
Roles and responsibilities
- Design, enhance and implement scalable, reliable, and maintainable technologies that run our product
- Distill business requirements into design specifications
- Write reusable, testable, and efficient code
- Ensure that our product is carrier-grade in terms of user experience, reliability, scalability, and performance
Required Qualifications
- B.Tech in Computer Science or a related field.
- 2+ years designing/developing large-scale internet software systems.
- Expertise in a variety of Python-based tools and libraries like Django, Flask or FastAPI, mod_wsgi
- Expertise in a variety of server-side tools and technologies: Linux, Apache, Memcached, Varnish, MySQL, ORMs
- Experience in developing scalable web Infrastructure serving high-volume traffic
- Expertise in core web languages and protocols: HTTP, HTML, REST, JavaScript, JSON.
- Familiarity with front-end technologies, such as Javascript, HTML5, CSS is a plus
- Experience working with AWS (EC2, CloudFront, ELB, S3, Boto, Auto Scaling) is a plus (see the sketch after this list)
- Passionate about building products in a fast-paced startup environment
- Strong empathy for users and customers.
- Strong technical documentation and presentation skills.
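Since hands-on AWS (EC2, S3, Boto) experience is called out above, here is a minimal boto3 sketch that lists EC2 instances and uploads a file to S3. The bucket name and file path are placeholders, not details from the posting.

```python
# Minimal boto3 sketch: list EC2 instances and upload a file to S3.
# Assumes configured AWS credentials; bucket and file names are placeholders.
import boto3

ec2 = boto3.client("ec2")
for reservation in ec2.describe_instances()["Reservations"]:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["State"]["Name"])

s3 = boto3.resource("s3")
s3.Bucket("example-assets-bucket").upload_file("report.csv", "reports/report.csv")
```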
Our client excels in providing top-notch business solutions to industries such as e-commerce, marketing, banking and finance, insurance, transport, and many more. For a generation driven by data, insights, and decision-making, we help businesses make the best possible use of data and enable them to thrive in this competitive space. Our expertise spans Data, Analytics, and Engineering, to name a few.
We are looking for a candidate with the experience to enhance our cloud infrastructure and optimize application performance.
- Bachelor’s degree in Computer Science or related field.
- 5+ years of DevOps experience with strong scripting skills (shell, Python, Ruby).
- Familiarity with open-source technologies and application development methodologies.
- Experience in optimizing both stand-alone and distributed systems.
Key Responsibilities
- Design and maintain DevOps practices for seamless application deployment.
- Utilize AWS tools (EBS, S3, EC2) and automation technologies (Ansible, Terraform).
- Manage Docker containers and Kubernetes environments.
- Implement CI/CD pipelines with tools like Jenkins and GitLab.
- Use monitoring tools (Datadog, Prometheus) for system reliability (see the sketch after this list).
- Collaborate effectively across teams and articulate technical choices.
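To illustrate the monitoring item above, here is a minimal sketch exposing a custom metric with the Prometheus Python client. The metric name and port are hypothetical; a real setup would have a Prometheus server scrape this endpoint.

```python
# Sketch: expose a custom counter metric for Prometheus to scrape.
# Assumes the prometheus_client package; metric name and port are placeholders.
import random
import time

from prometheus_client import Counter, start_http_server

deploys_total = Counter("deploys_total", "Number of application deployments")

if __name__ == "__main__":
    start_http_server(8000)  # metrics served at http://localhost:8000/metrics
    while True:
        time.sleep(5)
        if random.random() < 0.3:   # simulate an occasional deployment event
            deploys_total.inc()
```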
A leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage.
What are we looking for?
- Bachelor's degree in computer science, computer engineering, or related field.
- 4+ years of experience as a Python developer.
- Expert knowledge of Python and related frameworks including FastAPI/Flask/Django
- Experience in designing, developing, and managing cloud-based (AWS/GCP) infrastructure and applications.
- Good to have: knowledge of Docker and CI/CD pipelines
- Good understanding of relational databases (e.g., MySQL, PostgreSQL)
- Ability to integrate multiple data sources into a single system.
- Ability to collaborate on projects and work independently when required.
- Excellent problem-solving and communication abilities, to handle complex problems that arise during the development process.
Roles & Responsibilities:
- Developing applications using the Python programming language.
- Involvement in all aspects of the software development life cycle, from requirements gathering to testing and deployment.
- Writing clean, scalable & efficient code
- Integrating user-facing elements developed by front-end developers with server-side logic
- Building reusable code libraries for future use
- Working closely with other members of the development team, as well as customers or clients to ensure that applications are developed according to specifications.
- Testing applications thoroughly before deployment in order to ensure that they are free of errors.
- Deploying applications and providing support after deployment, if necessary.
- Assisting senior developers in mentoring junior staff members
- Updating software programs as new versions become available.
Join an innovative and groundbreaking cybersecurity startup focused on helping customers identify, mitigate, and protect against ever-evolving cyber threats. With the current geopolitical climate, organizations need to stay ahead of malicious threat actors as well as nation-state actors. Cybersecurity teams are getting overwhelmed, and they need intelligent systems to help them focus on addressing the biggest and current risks first.
We help organizations protect their assets and customer data by continuously evaluating the new threats and risks to their cloud environment. This will, in turn, help mitigate the high-priority threats quickly so that the engineers can spend more time innovating and providing value to their customers.
About the Engineering Team:
We have several decades of experience working in the security industry, having worked on some of the most cutting-edge security technology that helped protect millions of customers. We have built technologies from the ground up, partnered with the industry on innovation, and helped customers with some of the most stringent requirements. We leverage industry and academic experts and veterans for their unique insight. Our security technology spans all facets of software engineering, from data analytics and visualization and AI/ML processing to highly distributed, highly available services with real-time monitoring, integration with various other services, and protocol-level work. You will be learning from some of the best engineering talent with multi-cloud expertise.
We are looking for a highly experienced Principal Software Engineer to lead the development and scaling of our backend systems. The ideal candidate will have extensive experience in distributed systems, database management, Kubernetes, and cloud technologies. As a key technical leader, you will design, implement, and optimize critical backend services, working closely with cross-functional teams to ensure system reliability, scalability, and performance.
Key Responsibilities:
- Architect and Develop Distributed Systems: Design and implement scalable, distributed systems using microservices architecture. Expertise in both synchronous (REST/gRPC) and asynchronous communication patterns (message queues, Kafka), with a strong emphasis on building resilient services that can handle large data and maintain high throughput (see the sketch after this list). Craft cloud solutions tailored to specific needs, choosing appropriate AWS services and optimizing resource utilization to ensure performance and high availability.
- Database Architecture & Optimization: Lead efforts to design and manage databases with a focus on scaling, replication, query optimization, and managing large datasets.
- Performance & Reliability: Engage in continuous learning and innovation to improve customer satisfaction. Embrace accountability and respond promptly to service issues to maintain and enhance system health. Ensure the backend systems meet high standards for performance, reliability, and scalability, identifying and solving bottlenecks and architectural challenges by leveraging various observability tools (such as Prometheus and Grafana).
- Leadership & Mentorship: Provide technical leadership and mentorship to other engineers, guiding architecture decisions, reviewing code, and helping to build a strong engineering culture. Stay abreast of the latest industry trends in cloud technology, adopting best practices to continuously improve our services and security measures.
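As a hedged illustration of the asynchronous messaging patterns mentioned in the first responsibility, the sketch below sends and receives a message with AWS SQS via boto3. It is shown in Python for brevity even though the role's primary language is Golang; the queue URL and payload are placeholders, and SQS stands in for whichever queue or Kafka setup the team actually uses.

```python
# Sketch: asynchronous messaging with AWS SQS via boto3.
# Queue URL and payload are placeholders; assumes configured AWS credentials.
import json

import boto3

sqs = boto3.client("sqs")
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/example-findings"  # placeholder

# Producer side: publish an event without blocking the caller.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps({"finding_id": "f-1", "severity": "high"}),
)

# Consumer side: long-poll, process, then delete the message.
response = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=10)
for message in response.get("Messages", []):
    event = json.loads(message["Body"])
    print("processing finding", event["finding_id"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```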
Key Qualifications:
- Experience: 10+ years of experience in backend engineering, with at least 5 years of experience in building distributed systems.
- Technical Expertise:
- Distributed Systems: Extensive experience with microservices architecture, working with both synchronous (REST, gRPC) and asynchronous patterns (SNS, SQS). Strong understanding of service-to-service authentication and authorization, API rate limiting, and other critical aspects of scalable systems.
- Database: Expertise in database technologies, with experience working with large datasets, optimizing queries, handling replication, and creating views for performance. Hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB, Cassandra). Deep experience creating data models that present consistent views to the customer while the underlying data evolves, handling data migrations, and ensuring data integrity and high availability.
- Kubernetes: In-depth knowledge of Kubernetes, with experience deploying and managing services in Kubernetes clusters (EKS, AKS). Strong understanding of pods, services, networking, and scaling applications within Kubernetes environments.
- Golang: Proven experience using Golang as the primary programming language for backend development. Deep understanding of concurrency, performance optimization, and scalability in Golang applications.
- Cloud Technologies: Strong hands-on experience with AWS services (EC2, S3, DynamoDB, Lambda, RDS, EKS). Experience in designing and optimizing cloud-based architectures for large-scale distributed systems.
- Problem Solver: Strong problem-solving and debugging skills, with a proven ability to design and optimize complex systems.
- Leadership: Experience in leading engineering teams, guiding architectural decisions, and mentoring junior engineers.
Preferred Skills:
- Experience with infrastructure as code (Terraform, CloudFormation).
- Knowledge of GitHub-based CI/CD tools and best practices.
- Experience with monitoring and logging tools (Prometheus, Grafana, ELK).
- Cybersecurity experience.
🚀 We're Hiring: Python AWS Fullstack Developer at InfoGrowth! 🚀
Join InfoGrowth as a Python AWS Fullstack Developer and be a part of our dynamic team driving innovative cloud-based solutions!
Job Role: Python AWS Fullstack Developer
Location: Bangalore & Pune
Mandatory Skills:
- Proficiency in Python programming.
- Hands-on experience with AWS services and migration.
- Experience in developing cloud-based applications and pipelines.
- Familiarity with DynamoDB, OpenSearch, and Terraform (preferred).
- Solid understanding of front-end technologies: ReactJS, JavaScript, TypeScript, HTML, and CSS.
- Experience with Agile methodologies, Git, CI/CD, and Docker.
- Knowledge of Linux (preferred).
Preferred Skills:
- Understanding of ADAS (Advanced Driver Assistance Systems) and automotive technologies.
- AWS Certification is a plus.
Why Join InfoGrowth?
- Work on cutting-edge technology in a fast-paced environment.
- Collaborate with talented professionals passionate about driving change in the automotive and tech industries.
- Opportunities for professional growth and development through exciting projects.
🔗 Apply Now to elevate your career with InfoGrowth and make a difference in the automotive sector!
Roles and Responsibilities:
- Python Full Stack Development experience is preferred.
- Able to create modern data pipelines and data processing using AWS PaaS components (Glue, SageMaker Studio, etc.) or open-source tools (Spark, HBase, Hive, etc.).
- A good understanding of ML/AI algorithms and statistical algorithms is mandatory.
- Critical thinking and problem-solving are essential.
- Ability to model and design modern data structures, SQL/NoSQL databases, data lakes, and cloud data warehouses (Snowflake preferred).
- Experience with data stores such as DynamoDB, Redis, Elasticsearch, MySQL, Oracle, and AWS RDS.
- Deploying software using CI/CD tools such as Azure DevOps, Jenkins, etc.
- Experience with API tools such as REST, Swagger, Postman.
- Working in Agile Framework.
- Previous work experience in a fintech/finance product-based industry is a bonus.
- Previous work on the Churn models is a bonus.
A leading data & analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage.
Roles & Responsibilities:
- Develop and maintain mobile-responsive web applications using React.
- Collaborate with our UI/UX designers to translate design wireframes into responsive web applications.
- Ensure web applications function flawlessly on various web browsers and platforms.
- Implement performance optimizations to enhance the mobile user experience.
- Proven experience as a Mobile Responsive Web Developer or a similar role is a must.
- Knowledge of web performance optimization and browser compatibility.
- Excellent problem-solving skills and attention to detail.
What are we looking for?
- 4+ years’ experience as a front-end developer with hands-on experience in React.js & Redux
- Experience as a UI/UX designer.
- Familiar with cloud infrastructure (Azure, AWS, or Google Cloud Services).
- Expert knowledge of CSS, CSS extension languages (Less, Sass), and CSS preprocessor tools.
- Expert knowledge of HTML5 and its best practices.
- Proficiency in designing interfaces and building clickable prototypes.
- Experience with Test Driven Development and Acceptance Test Driven Development.
- Proficiency using version control tools
- Effective communication and teamwork skills.
Key Responsibilities
• Lead the automation testing effort of our cloud management platform.
• Create and maintain automation test cases and test suites.
• Work closely with the development team to ensure that the automation tests are integrated into the development process.
• Collaborate with other QA team members to identify and resolve defects.
• Implement automation testing best practices and continuously improve the automation testing framework.
• Develop and maintain automation test scripts using programming languages such as Python.
• Conduct performance testing using tools such as JMeter, Gatling, or Locust.
• Monitor and report on automation testing and performance testing progress and results.
• Ensure that the automation testing and performance testing strategy aligns with overall product quality goals and objectives.
• Manage and mentor a team of automation QA engineers.
Requirements
• Bachelor's degree in Computer Science or a related field.
• Minimum of 8+ years of experience in automation testing and performance testing.
• Experience in leading and managing automation testing teams.
• Strong experience with automation testing frameworks including Robot Framework.
• Strong experience with programming languages, including Python.
• Strong understanding of software development lifecycle and agile methodologies.
• Experience with testing cloud-based applications.
• Good understanding of Cloud services & ecosystem, specifically AWS.
• Experience with performance testing tools such as JMeter, Gatling, or Locust (see the sketch after this list).
• Excellent analytical and problem-solving skills.
• Excellent written and verbal communication skills.
• Ability to work independently and in a team environment.
• Passionate about automation testing and performance testing.
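Because Locust is one of the performance tools named above, here is a minimal Locust user class in Python. The endpoints are hypothetical, and the target host would be supplied on the command line.

```python
# Minimal Locust load test sketch (endpoints are hypothetical).
# Run with: locust -f locustfile.py --host=https://staging.example.com
from locust import HttpUser, task, between

class PlatformUser(HttpUser):
    wait_time = between(1, 3)  # simulated think time between requests

    @task(3)
    def list_resources(self):
        self.client.get("/api/v1/resources")

    @task(1)
    def view_dashboard(self):
        self.client.get("/api/v1/dashboard")
```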
Senior DevOps Engineer
Experience: Minimum 5 years of relevant experience
Key Responsibilities:
• Hands-on experience with AWS tools, CI/CD pipelines, and Red Hat Linux
• Strong expertise in DevOps practices and principles
• Experience with infrastructure automation and configuration management
• Excellent problem-solving skills and attention to detail
Nice to Have:
• Red Hat certification
A leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage. We are the partner of choice for enterprises on their digital transformation journey. Our teams offer solutions and services at the intersection of Advanced Data, Analytics, and AI.
Skills: ITSM methodologies, Python, Snowflake, and AWS. Open to 18x5 support as well.
Immediate Joiner - 30 days NP
• Bachelor’s degree in computer science, Software Engineering, or a related field.
• Should have 5+ years of hands-on experience in ITSM methodologies
• 3+ Years of experience in SQL, Snowflake, Python development
• 2+ years hands-on experience in Snowflake DW
• Good communication and client/stakeholder management skills
• Willing to work across multiple time zones and to handle teams based offshore
Backend Developer
Requirements:
- Proficiency in MySQL, AWS, Git, PHP, HTML.
- Minimum 2 years of experience in Laravel framework.
- Minimum 3 years of experience in PHP development.
- Overall professional experience of 3+ years.
- Basic knowledge of JavaScript, TypeScript, Node.js, and Express framework.
- Education: Graduation with an aggregate of 70%.
Roles and Responsibilities:
- The primary role will be development, quality check and maintenance of the platform to ensure improvement and stability.
- Contribute to the development of effective functions and systems that can meet the overall objectives of the company.
- Understanding of performance engineering and optimization.
- Ability to design and code complex programs.
A leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage. We are the partner of choice for enterprises on their digital transformation journey. Our teams offer solutions and services at the intersection of Advanced Data, Analytics, and AI.
Skills: Python, FastAPI, AWS/GCP/Azure
Location - Bangalore / Mangalore (Hybrid)
NP - Immediate - 20 days
• Experience in building Python-based, utility-scale enterprise APIs with QoS/SLA-based specs, building upon cloud APIs from GCP, AWS, and Azure
• Exposure to multi-modal (text, audio, video) development in synchronous and batch modes in high-volume use cases, leveraging queuing, pooling, and enterprise scaling patterns.
• Solid understanding of the API life cycle, including versioning (e.g., parallel deployment of multiple versions) and exception management (see the sketch after this list).
• Working experience (development and/or troubleshooting) with enterprise-scale AWS and CI/CD leveraging GitHub Actions-based workflows.
• Solid knowledge of developing/updating enterprise CloudFormation templates for Python-centric code assets, along with quality/security tooling.
• Design/support tracing/monitoring capability (X-Ray, AWS Distro for OpenTelemetry) for Fargate services.
• Responsible and able to communicate requirements.
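As a small sketch of the versioned API life cycle point above, the code below mounts two versions of the same endpoint under parallel /v1 and /v2 prefixes with FastAPI. The route names and response shapes are hypothetical.

```python
# Sketch: parallel deployment of API versions with FastAPI routers.
# Route names and payloads are hypothetical; assumes the fastapi package.
from fastapi import APIRouter, FastAPI

app = FastAPI()

v1 = APIRouter(prefix="/v1")
v2 = APIRouter(prefix="/v2")

@v1.get("/transcriptions/{job_id}")
def get_transcription_v1(job_id: str):
    # Older contract: plain text only.
    return {"job_id": job_id, "text": "placeholder"}

@v2.get("/transcriptions/{job_id}")
def get_transcription_v2(job_id: str):
    # Newer contract: adds confidence and language metadata.
    return {"job_id": job_id, "text": "placeholder", "confidence": 0.92, "language": "en"}

# Both versions are served by the same application during the transition window.
app.include_router(v1)
app.include_router(v2)
```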
Position Overview: We are seeking a talented and experienced Cloud Engineer specialized in AWS cloud services to join our dynamic team. The ideal candidate will have a strong background in AWS infrastructure and services, including EC2, Elastic Load Balancing (ELB), Auto Scaling, S3, VPC, RDS, CloudFormation, CloudFront, Route 53, AWS Certificate Manager (ACM), and Terraform for Infrastructure as Code (IaC). Experience with other AWS services is a plus.
Responsibilities:
• Design, deploy, and maintain AWS infrastructure solutions, ensuring scalability, reliability, and security.
• Configure and manage EC2 instances to meet application requirements.
• Implement and manage Elastic Load Balancers (ELB) to distribute incoming traffic across multiple instances.
• Set up and manage AWS Auto Scaling to dynamically adjust resources based on demand.
• Configure and maintain VPCs, including subnets, route tables, and security groups, to control network traffic.
• Deploy and manage AWS CloudFormation and Terraform templates to automate infrastructure provisioning using Infrastructure as Code (IaC) principles (see the sketch after this list).
• Implement and monitor S3 storage solutions for secure and scalable data storage
• Set up and manage CloudFront distributions for content delivery with low latency and high transfer speeds.
• Configure Route 53 for domain management, DNS routing, and failover configurations.
• Manage AWS Certificate Manager (ACM) for provisioning, managing, and deploying SSL/TLS certificates.
• Collaborate with cross-functional teams to understand business requirements and provide effective cloud solutions.
• Stay updated with the latest AWS technologies and best practices to drive continuous improvement.
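As one hedged example of the IaC item above, the sketch below creates a CloudFormation stack from an inline template using boto3. The stack name and the single S3 bucket resource are illustrative only; Terraform would be an equally valid route.

```python
# Sketch: provision a minimal CloudFormation stack with boto3.
# Stack name and template are illustrative; assumes configured AWS credentials.
import json

import boto3

template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "DemoBucket": {"Type": "AWS::S3::Bucket"}  # single placeholder resource
    },
}

cfn = boto3.client("cloudformation")
cfn.create_stack(StackName="demo-iac-stack", TemplateBody=json.dumps(template))

# Block until the stack is fully created (raises if creation fails).
cfn.get_waiter("stack_create_complete").wait(StackName="demo-iac-stack")
print("stack created")
```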
Qualifications:
• Bachelor's degree in computer science, Information Technology, or a related field.
• Minimum of 2 years of relevant experience in designing, deploying, and managing AWS cloud solutions.
• Strong proficiency in AWS services such as EC2, ELB, Auto Scaling, VPC, S3, RDS, and CloudFormation.
• Experience with other AWS services such as Lambda, ECS, EKS, and DynamoDB is a plus.
• Solid understanding of cloud computing principles, including IaaS, PaaS, and SaaS.
• Excellent problem-solving skills and the ability to troubleshoot complex issues in a cloud environment.
• Strong communication skills with the ability to collaborate effectively with cross-functional teams.
• Relevant AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified DevOps Engineer, etc.) are highly desirable.
Additional Information:
• We value creativity, innovation, and a proactive approach to problem-solving.
• We offer a collaborative and supportive work environment where your ideas and contributions are valued.
• Opportunities for professional growth and development.
Someshwara Software Pvt Ltd is an equal opportunity employer. We celebrate diversity and are dedicated to creating an inclusive environment for all employees.
- Responsible for designing, storing, processing, and maintaining large-scale data and related infrastructure.
- Can drive multiple projects both from operational and technical standpoint.
- Ideate and build PoVs or PoCs for new products that can help drive more business.
- Responsible for defining, designing, and implementing data engineering best practices, strategies, and solutions.
- Is an Architect who can guide the customers, team, and overall organization on tools, technologies, and best practices around data engineering.
- Lead architecture discussions, align with business needs, security, and best practices.
- Has strong conceptual understanding of Data Warehousing and ETL, Data Governance and Security, Cloud Computing, and Batch & Real Time data processing
- Has strong execution knowledge of Data Modeling, Databases in general (SQL and NoSQL), software development lifecycle and practices, unit testing, functional programming, etc.
- Understanding of Medallion architecture pattern
- Has worked on at least one cloud platform.
- Has worked as a data architect and executed multiple end-to-end data engineering projects.
- Has extensive knowledge of different data architecture designs and data modelling concepts.
- Manages conversations with client stakeholders to understand requirements and translate them into technical outcomes.
Required Tech Stack
- Strong proficiency in SQL
- Experience working on any of the three major cloud platforms i.e., AWS/Azure/GCP
- Working knowledge of an ETL and/or orchestration tools like IICS, Talend, Matillion, Airflow, Azure Data Factory, AWS Glue, GCP Composer, etc.
- Working knowledge of one or more OLTP databases (Postgres, MySQL, SQL Server, etc.)
- Working knowledge of one or more Data Warehouse like Snowflake, Redshift, Azure Synapse, Hive, Big Query, etc.
- Proficient in at least one programming language used in data engineering, such as Python (or Scala/Rust/Java)
- Has strong execution knowledge of Data Modeling (star schema, snowflake schema, fact vs dimension tables); see the sketch after this list
- Proficient in Spark and related applications like Databricks, GCP DataProc, AWS Glue, EMR, etc.
- Has worked on Kafka and real-time streaming.
- Has strong execution knowledge of data architecture design patterns (lambda vs kappa architecture, data harmonization, customer data platforms, etc.)
- Has worked on code and SQL query optimization.
- Strong knowledge of version control systems like Git to manage source code repositories and designing CI/CD pipelines for continuous delivery.
- Has worked on data and networking security (RBAC, secret management, key vaults, vnets, subnets, certificates)
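To ground the star-schema point above, here is a minimal sketch that creates one dimension and one fact table in Postgres via psycopg2. The table and column names are invented for illustration, and the connection settings are placeholders.

```python
# Sketch: create a tiny star schema (one dimension, one fact) in Postgres.
# Table/column names are invented; connection settings are placeholders.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_key SERIAL PRIMARY KEY,
    customer_id  TEXT NOT NULL,
    region       TEXT
);

CREATE TABLE IF NOT EXISTS fact_sales (
    sale_id      BIGSERIAL PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
    sale_date    DATE NOT NULL,
    amount       NUMERIC(12, 2) NOT NULL
);
"""

conn = psycopg2.connect(host="localhost", dbname="analytics", user="etl", password="***")
with conn, conn.cursor() as cur:
    cur.execute(DDL)   # the connection context manager commits on success
conn.close()
```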
A NASDAQ-listed, service-provider IT company
Job Summary:
As a Cloud Architect, you will play a pivotal role in designing, implementing, and maintaining our multi-cloud infrastructure. You will work closely with various teams to ensure our cloud solutions are scalable, secure, and efficient across different cloud providers. Your expertise in multi-cloud strategies, database management, and microservices architecture will be essential to our success.
Key Responsibilities:
- Design and implement scalable, secure, and high-performance cloud architectures across multiple cloud platforms (AWS, Azure, Google Cloud Platform).
- Lead and manage cloud migration projects, ensuring seamless transitions between on-premises and cloud environments.
- Develop and maintain cloud-native solutions leveraging services from various cloud providers.
- Architect and deploy microservices using REST and GraphQL to support our application development needs.
- Collaborate with DevOps and development teams to ensure best practices in continuous integration and deployment (CI/CD).
- Provide guidance on database architecture, including relational and NoSQL databases, ensuring optimal performance and security.
- Implement robust security practices and policies to protect cloud environments and data.
- Design and implement data management strategies, including data governance, data integration, and data security.
- Stay up to date with the latest industry trends and emerging technologies to drive continuous improvement and innovation.
- Troubleshoot and resolve cloud infrastructure issues, ensuring high availability and reliability.
- Optimize cost and performance across different cloud environments.
Qualifications/ Experience & Skills Required:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: 10 - 15 Years
- Proven experience as a Cloud Architect or in a similar role, with a strong focus on multi-cloud environments.
- Expertise in cloud migration projects, both lift-and-shift and greenfield implementations.
- Strong knowledge of cloud-native solutions and microservices architecture.
- Proficiency in using GraphQL for designing and implementing APIs.
- Solid understanding of database technologies, including SQL, NoSQL, and cloud-based database solutions.
- Experience with DevOps practices and tools, including CI/CD pipelines.
- Excellent problem-solving skills and ability to troubleshoot complex issues.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Deep understanding of cloud security practices and data protection regulations (e.g., GDPR, HIPAA).
- Experience with data management, including data governance, data integration, and data security.
Preferred Skills:
- Certifications in multiple cloud platforms (e.g., AWS Certified Solutions Architect, Google Certified Professional Cloud Architect, Microsoft Certified: Azure Solutions Architect).
- Experience with containerization technologies (Docker, Kubernetes).
- Familiarity with cloud cost management and optimization tools.
at Majoris Technologies
For one of our premium product-based customers, we are looking to hire a team of software developers in Bangalore; we are looking for tech geeks with 2+ years of full-time experience.
SENIOR SOFTWARE DEVELOPMENT ENGINEER - FRONTEND
● Overall, 2-4 years of experience.
● Solid foundations of JavaScript, responsive web, CSS, semantic HTML, and how the internet works.
● Strong proficiency with React and its core principles. React Native is a plus.
● Solid understanding of Chrome dev tools, APIs, and frontend performance.
● Working knowledge of GitHub and popular cloud platforms like AWS, Fly, Cloudflare, etc.
● Understanding of software design patterns, high-level design, and architecture.
● Ability to independently do LLD and technology exploration around a given problem statement.
at REConnect Energy
Work at the Intersection of Energy, Weather & Climate Sciences and Artificial Intelligence
About the company:
REConnect Energy is India's largest tech-enabled service provider in predictive analytics and demand-supply aggregation for the energy sector. We focus on digital intelligence for climate resilience, offering solutions for efficient asset and grid management, minimizing climate-induced risks, and providing real-time visibility of assets and resources.
Responsibilities:
- Design, develop, and maintain data engineering pipelines using Python.
- Implement and optimize database solutions with SQL and NoSQL databases (MySQL and MongoDB); see the sketch after this list.
- Perform data analysis, profiling, and quality assurance to ensure high service quality standards.
- Troubleshoot and resolve data-pipeline related issues, ensuring optimal performance and reliability.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Participate in code reviews and contribute to the continuous improvement of the codebase.
- Utilize GitHub for version control and collaboration.
- Implement and manage containerization solutions using Docker.
- Implement tech solutions to new product development, ensuring scalability, performance, and security.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, Electrical Engineering, or equivalent.
- Proficiency in Python programming and expertise in data engineering.
- Experience with databases, including MySQL and NoSQL stores.
- Experience in developing and maintaining critical and high availability systems will be given strong preference.
- Experience working with AWS cloud platform.
- Strong analytical and data-driven approach to problem solving.
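As a small, hedged sketch of the Python/MySQL/MongoDB pipeline work described above, the code below copies rows from a MySQL table into a MongoDB collection. The connection details, table, and collection names are placeholders, and the client libraries (PyMySQL, PyMongo) are assumptions rather than the team's stated stack.

```python
# Sketch: move rows from MySQL into MongoDB (names and credentials are placeholders).
# Assumes the PyMySQL and PyMongo packages.
import pymysql
from pymongo import MongoClient

mysql_conn = pymysql.connect(host="localhost", user="etl", password="***", database="energy")
mongo = MongoClient("mongodb://localhost:27017")
readings = mongo["energy"]["readings"]

with mysql_conn.cursor(pymysql.cursors.DictCursor) as cur:
    cur.execute("SELECT site_id, measured_at, generation_kwh FROM site_readings")
    rows = cur.fetchall()

if rows:
    readings.insert_many(rows)   # each dict row becomes one MongoDB document
print(f"copied {len(rows)} rows")

mysql_conn.close()
mongo.close()
```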
Candidate Profile -
- Should have worked either as a Key Account Manager with an e-commerce platform or as a P&L Lead
- Good understanding of the Amazon ecosystem and managing AMS campaigns
- Proven work experience in e-commerce of at least 3-4 years; proficient in P&L management, operations, and data analytics
- Entrepreneurial mindset to get the job done single-handedly. Strong Analytical as well as Communication skills
Role Description -
This is a crucial role spanning across top e-commerce customers. A key responsibility of this role is to own, manage, and build the P&L for the channel.
1) This role would be responsible for driving key projects and delivering the AOP for the MP team, aligning with various internal stakeholders such as KAMs, category managers, supply chain, and finance teams.
2) Develop strategic plans for our brand in line with business and sales objectives.
3) Complete responsibility for volume forecasts, supply chain efficiency, and account reconciliations.
4) Deliver and exceed on growth, revenue, and market share objectives
What you'll do:
Be a part of the initial team to define and set up a best-in-class digital platform for the Private Credit industry, and take full ownership of the components of the digital platform
Develop responsive and visually appealing user interfaces for web applications using React.js.
Building reusable components and front-end libraries for future use
Translating designs and wireframes into high quality code
Optimising components for maximum performance across a vast array of web-capable devices and browsers
Knowledge of bundling tools and dynamic loading of components.
Stay up-to-date with the latest web development trends, tools, and technologies.
Share/enhance insights and talent in a collaborative work environment responsible for building quick prototypes
What makes you a great fit:
3-5 years of experience developing UI applications
Must be proficient with ReactJS
Knowledge of a React framework with server components, such as Next.js or Remix, is an added advantage.
Experience with popular React.js state management libraries (such as Redux or Zustand)
Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model
Must be proficient with HTML/CSS and responsiveness.
Exposure to a variety of UI frameworks is a strong plus, specifically AngularJS and/or EmberJS
Experience with TypeScript a plus
Experience with testing frameworks such as JEST is a plus
Work closely with the designers and backend engineers to build a holistic user experience. Help define and interact with REST-based APIs.
Exposure to Web Security, Microservices, AWS components, and Cloud concepts a plus
Strong attention to detail, with an eye toward pixel-perfection
Strong analytical and problem solving skills.
Ability to work effectively in a fast-paced startup environment and adapt to changing priorities.
About Us:
Binocs.co empowers institutional lenders with next-generation loan management software, streamlining the entire loan lifecycle and facilitating seamless interaction among stakeholders.
Team: Binocs.co is led by a passionate team with extensive experience in financial technology, lending, AI, and software development.
Investors: Our journey is backed by renowned investors who share our vision for transforming the loan management landscape: Beenext, Arkam Ventures, Accel, Saison Capital, Blume Ventures, Premji Invest, and Better Capital.
At the forefront of innovation in the digital video industry
Responsibilities:
- Work with development teams and product managers to ideate software solutions
- Design client-side and server-side architecture
- Creating a well-informed cloud strategy and managing the adaptation process
- Evaluating cloud applications, hardware, and software
- Develop and manage well-functioning databases and applications
- Write effective APIs
- Participate in the entire application lifecycle, focusing on coding and debugging
- Write clean code to develop, maintain and manage functional web applications
- Get feedback from, and build solutions for, users and customers
- Participate in requirements, design, and code reviews
- Engage with customers to understand and solve their issues
- Collaborate with remote team on implementing new requirements and solving customer problems
- Focus on quality of deliverables with high accountability and commitment to program objectives
Required Skills:
- 7–10 years of software development experience
- Experience using Amazon Web Services (AWS), Microsoft Azure, Google Cloud, or other major cloud computing services.
- Strong skills in Containers, Kubernetes, Helm
- Proficiency in C#, .NET, PHP /Java technologies with an acumen for code analysis, debugging and problem solving
- Strong skills in database design (PostgreSQL or MySQL)
- Experience with caching and message queues
- Experience in REST API framework design
- Strong focus on high-quality and maintainable code
- Understanding of multithreading, memory management, object-oriented programming
Preferred skills:
- Experience in working with Linux OS
- Experience in Core Java programming
- Experience in working with JSP/Servlets, Struts, Spring / Spring Boot, Hibernate
- Experience in working with web technologies: HTML, CSS
- Knowledge of source versioning tools, particularly JIRA, Git, Stash, and Jenkins.
- Domain knowledge of video and audio codecs
Job Purpose and Impact
The DevOps Engineer is a key position to strengthen the security automation capabilities which have been identified as a critical area for growth and specialization within Global IT’s scope. As part of the Cyber Intelligence Operation’s DevOps Team, you will be helping shape our automation efforts by building, maintaining and supporting our security infrastructure.
Key Accountabilities
- Collaborate with internal and external partners to understand and evaluate business requirements.
- Implement modern engineering practices to ensure product quality.
- Provide designs, prototypes and implementations incorporating software engineering best practices, tools and monitoring according to industry standards.
- Write well-designed, testable and efficient code using full-stack engineering capability.
- Integrate software components into a fully functional software system.
- Independently solve moderately complex issues with minimal supervision, while escalating more complex issues to appropriate staff.
- Proficiency in at least one configuration management or orchestration tool, such as Ansible.
- Experience with cloud monitoring and logging services.
Qualifications
Minimum Qualifications
- Bachelor's degree in a related field or equivalent experience
- Knowledge of public cloud services & application programming interfaces
- Working experience with continuous integration and delivery practices
Preferred Qualifications
- 3-5 years of relevant experience, whether in IT, IS, or software development
- Experience in:
- Code repositories such as Git
- Scripting languages (Python & PowerShell)
- Using Windows, Linux, Unix, and mobile platforms within cloud services such as AWS
- Cloud infrastructure as a service (IaaS) / platform as a service (PaaS), microservices, Docker containers, Kubernetes, Terraform, Jenkins
- Databases such as Postgres, SQL, Elastic
Job Title: Data Engineer
Cargill’s size and scale allows us to make a positive impact in the world. Our purpose is to nourish the world in a safe, responsible and sustainable way. We are a family company providing food, ingredients, agricultural solutions and industrial products that are vital for living. We connect farmers with markets so they can prosper. We connect customers with ingredients so they can make meals people love. And we connect families with daily essentials — from eggs to edible oils, salt to skincare, feed to alternative fuel. Our 160,000 colleagues, operating in 70 countries, make essential products that touch billions of lives each day. Join us and reach your higher purpose at Cargill.
Job Purpose and Impact
As a Data Engineer at Cargill you work across the full stack to design, develop and operate high performance and data centric solutions using our comprehensive and modern data capabilities and platforms. You will play a critical role in enabling analytical insights and process efficiencies for Cargill’s diverse and complex business environments. You will work in a small team that shares your passion for building innovative, resilient, and high-quality solutions while sharing, learning and growing together.
Key Accountabilities
· Collaborate with business stakeholders, product owners and across your team on product or solution designs.
· Develop robust, scalable and sustainable data products or solutions utilizing cloud-based technologies.
· Provide moderately complex technical support through all phases of product or solution life cycle.
· Perform data analysis, handle data modeling, and configure and develop data pipelines to move and optimize data assets.
· Build moderately complex prototypes to test new concepts and provide ideas on reusable frameworks, components and data products or solutions and help promote adoption of new technologies.
· Independently solve moderately complex issues with minimal supervision, while escalating more complex issues to appropriate staff.
· Other duties as assigned
Qualifications
MINIMUM QUALIFICATIONS
· Bachelor’s degree in a related field or equivalent experience
· Minimum of two years of related work experience
· Other minimum qualifications may apply
PREFERRED QUALIFICATIONS
· Experience developing modern data architectures, including data warehouses, data lakes, data meshes, hubs and associated capabilities including ingestion, governance, modeling, observability and more.
· Experience with data collection and ingestion capabilities, including AWS Glue, Kafka Connect and others.
· Experience with data storage and management of large, heterogeneous datasets, including formats, structures, and cataloging with tools such as Iceberg, Parquet, Avro, ORC, S3, HDFS, Hive, Kudu, or others.
· Experience with transformation and modeling tools, including SQL-based transformation frameworks and orchestration and quality frameworks such as dbt, Apache NiFi, Talend, AWS Glue, Airflow, Dagster, Great Expectations, Oozie, and others (see the sketch after this list)
· Experience working in Big Data environments including tools such as Hadoop and Spark
· Experience working in Cloud Platforms including AWS, GCP or Azure
· Experience of streaming and stream integration or middleware platforms, tools, and architectures such as Kafka, Flink, JMS, or Kinesis.
· Strong programming knowledge of SQL, Python, R, Java, Scala or equivalent
· Proficiency in engineering tooling including Docker, Git, and container orchestration services
· Strong experience working in DevOps models, with a demonstrable understanding of associated best practices for code management, continuous integration, and deployment strategies.
· Experience and knowledge of data governance considerations, including quality, privacy, and security, and the associated implications for data product development and consumption.
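As a hedged illustration of the orchestration tooling named in the transformation item above, here is a minimal Airflow 2.x DAG with a single Python task. The DAG id, schedule, and the task body are placeholders rather than anything specific to this role.

```python
# Minimal Airflow 2.x DAG sketch (DAG id, schedule, and task body are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders():
    # Placeholder: a real task would pull from an API or database and land data in S3.
    print("extracting orders...")

with DAG(
    dag_id="example_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
```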
Equal Opportunity Employer, including Disability/Vet.
Job Description:
- Experience in Core Java, Spring Boot.
- Experience in microservices.
- Extensive experience in developing enterprise-scale systems for global organizations. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should be able to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS in SQL Server, Postgres, Oracle or DB2
- Good knowledge of multi-threading
- Basic working knowledge of Unix/Linux
- Excellent problem solving and coding skills in Java
- Strong interpersonal, communication and analytical skills.
- Should be able to express their design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom, and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.
Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.
Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.
Job Description - Manager Sales
Minimum 15 years of experience.
Should have experience in sales of the Cloud IT SaaS product portfolio that Savex deals with.
Team management experience, leading a cloud business including teams.
Sales manager - Cloud Solutions
Reporting to Sr Management
Good personality
Distribution background
Keen on Channel partners
Good database of OEMs and channel partners.
Age group - 35 to 45yrs
Male Candidate
Good communication
B2B Channel Sales
Location - Bangalore
If interested, reply with your CV and the details below:
Total exp -
Current ctc -
Exp ctc -
Np -
Current location -
Qualification -
Total exp Channel Sales -
What are the Cloud IT products, you have done sales for?
What is the annual revenue generated through sales?
Experience: 5+ Years
• Experience in Core Java, Spring Boot
• Experience in microservices and Angular
• Extensive experience in developing enterprise-scale systems for global organizations. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should be able to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS in SQL Server, Postgres, Oracle or DB2
• Good knowledge of multi-threading
• Basic working knowledge of Unix/Linux
• Excellent problem solving and coding skills in Java
• Strong interpersonal, communication and analytical skills.
• Should be able to express their design ideas and thoughts
• 5+ years of hands-on experience in .NET, Angular, and related web technologies.
• 2 - 3+ years of experience designing and supporting Azure environments, including IaaS and PaaS
• Past experience in cloud computing and technology systems, as well as experience in designing and transferring applications to the cloud.
• Proven experience in Continuous Integration (CI) and Continuous Delivery (CD) as well as DevOps principles to achieve SDLC goals of Global Banking Technology
• Hands on experience using the Azure/AWS administration portal.
• Knowledge of at least one other cloud hosting solution (e.g., Azure, Google, AWS, Helion Cloud).
• Good hands-on experience in SOA, XML, WSDL, XSD, WSDL XML schema and namespaces (J2EE and .NET), MS .NET Framework, C#, HTML, Javascript, Micro Services/ APIs, Messaging, Threading, IBM DB.
• Have experience on development of container applications.
• Good knowledge of IBM DB. Should be comfortable writing stored procedures and user defined functions.
• Strong business knowledge of wealth management industry.
• Component / business object modeling, services modeling.
• Experience in building high-concurrency, low-latency, 24x7-availability applications.
• Strong programming skills with emphasis on writing efficient algorithms to handle large data sets and processing.
• Understanding of HTTP, IIS and how the browser works.
• Ability to coordinate with various teams to deliver projects successfully.
• Knowledge of UML design.
• Knowledge of source control (preferably Git) and continuous Integration tools.
• Have working knowledge of DevOps and best practices. Understanding of major DevOps tools and implementation for source control, continuous integration, configuration management, deployment automation, containers & orchestration.
• Experience in working in Agile framework.
• Good written and verbal communication skills.
Skills Desired:
• Financial markets, payment solutions or Wealth Management previous experience
Educational Qualification:
• Bachelor’s degree in Engineering preferred from an accredited college/university
- Bachelor of Computer Science or Equivalent Education
- At least 5 years of experience in a relevant technical position.
- Azure and/or AWS experience
- Strong in CI/CD concepts and technologies like GitOps (Argo CD)
- Hands-on experience with DevOps Tools (Jenkins, GitHub, SonarQube, Checkmarx)
- Experience with Helm Charts for package management
- Strong in Kubernetes, OpenShift, and Container Network Interface (CNI)
- Experience with programming and scripting languages (Spring Boot, NodeJS, Python)
- Strong container image management experience using Docker and distroless concepts
- Familiarity with Shared Libraries for code reuse and modularity
- Excellent communication skills (verbal, written, and presentation)
Note: Looking for immediate joiners only.
Job Description: Network Fresher
Role: Network Fresher
Experience: Fresher (Will be working as a Trainee for 1 year)
Location: Bangalore
Notice Period: Immediate
Shift Timings and Working Days: Rotational Shifts & 6 Days working (complete work from office)
Current Location: Candidates must be currently located in Bangalore
Required Skills:
- Basic understanding of networking concepts and protocols.
- CCNA training is mandatory.
- Knowledge of Linux, server management, AWS, and cloud computing.
- Strong analytical and problem-solving skills.
- Ability to work in rotational shifts.
- Excellent verbal and written communication skills.
Educational Qualification:
- Graduation must (B.Tech / B.Sc / B.E / BCA). Candidates should have a provisional or passing certificate.
Greetings, Wissen Technology is hiring for the position of Data Engineer.
Please find the Job Description for your Reference:
JD
- Design, develop, and maintain data pipelines on AWS EMR (Elastic MapReduce) to support data processing and analytics (see the sketch after this list).
- Implement data ingestion processes from various sources including APIs, databases, and flat files.
- Optimize and tune big data workflows for performance and scalability.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Manage and monitor EMR clusters, ensuring high availability and reliability.
- Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and store data in data lakes and data warehouses.
- Implement data security best practices to ensure data is protected and compliant with relevant regulations.
- Create and maintain technical documentation related to data pipelines, workflows, and infrastructure.
- Troubleshoot and resolve issues related to data processing and EMR cluster performance.
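To make the EMR pipeline item above concrete, here is a minimal PySpark sketch of the ingest-transform-write pattern that would typically run as an EMR step. The S3 paths and column names are placeholders, not details from the posting.

```python
# Minimal PySpark ETL sketch for an EMR step (S3 paths and columns are placeholders).
# Typically packaged and submitted with spark-submit as an EMR step.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-daily-etl").getOrCreate()

# Ingest: raw CSV landed by an upstream process.
raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Transform: basic cleansing and a daily aggregate.
cleaned = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("amount", F.col("amount").cast("double"))
)
daily = cleaned.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))

# Load: write partitioned Parquet back to the data lake.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders_daily/"
)

spark.stop()
```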
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering, with a focus on big data technologies.
- Strong experience with AWS services, particularly EMR, S3, Redshift, Lambda, and Glue.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with big data frameworks and tools such as Hadoop, Spark, Hive, and Pig.
- Solid understanding of data modeling, ETL processes, and data warehousing concepts.
- Experience with SQL and NoSQL databases.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).
- Strong problem-solving skills and the ability to work independently and collaboratively in a team environment
DevOps Engineer (Permanent)
Experience: 8 to 12 yrs
Location: Remote for 2-3 months (Any Mastek Location- Chennai/Mumbai/Pune/Noida/Gurgaon/Ahmedabad/Bangalore)
Max Salary = 28 LPA (including 10% variable)
Notice Period: Immediate/ max 10days
Mandatory Skills: Either Splunk/Datadog, GitLab, Retail domain
· Bachelor’s degree in Computer Science/Information Technology, or in a related technical field or equivalent technology experience.
· 10+ years’ experience in software development
· 8+ years of experience in DevOps
· Mandatory Skills: Either Splunk or Datadog, GitLab, EKS, Retail domain experience
· Experience with the following Cloud Native tools: Git, Jenkins, Grafana, Prometheus, Ansible, Artifactory, Vault, Splunk, Consul, Terraform, Kubernetes
· Working knowledge of containers, i.e., Docker and Kubernetes, ideally with experience transitioning an organization through their adoption
· Demonstrable experience with configuration, orchestration, and automation tools such as Jenkins, Puppet, Ansible, Maven, and Ant to provide full stack integration
· Strong working knowledge of enterprise platforms, tools and principles including Web Services, Load Balancers, Shell Scripting, Authentication, IT Security, and Performance Tuning
· Demonstrated understanding of system resiliency, redundancy, failovers, and disaster recovery
· Experience working with a variety of vendor APIs including cloud, physical and logical infrastructure devices
· Strong working knowledge of Cloud offerings & Cloud DevOps Services (EC2, ECS, IAM, Lambda, Cloud services, AWS CodeBuild, CodeDeploy, Code Pipeline etc or Azure DevOps, API management, PaaS)
· Experience managing and deploying Infrastructure as Code, using tools like Terraform, Helm charts, etc.
· Manage and maintain standards for DevOps tools used by the team (see the illustrative automation sketch below)
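Purely as an illustrative sketch (the region and cluster names are assumptions, not details from this role), routine automation around a stack like this often comes down to small scripts, e.g. a boto3 health report for EKS clusters:
```python
# Hypothetical example: report the status of every EKS cluster in one region.
# Assumes AWS credentials are already configured for boto3.
import boto3

eks = boto3.client("eks", region_name="ap-south-1")

for name in eks.list_clusters()["clusters"]:
    cluster = eks.describe_cluster(name=name)["cluster"]
    print(f"{name}: status={cluster['status']}, version={cluster['version']}")
```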
✓ Prior experience in multi-screen User Interface / User Experience product design and development.
✓ Very strong knowledge in:
• Software design, design patterns, architecture and best practices
• UI technologies and frameworks:
o ES6 JavaScript / TypeScript
o React Native / ReactJS – including modern concepts of server rendering, React Server Components and Server Actions
o Redux
o CSS – responsive design
• Android/iOS native development experience
• Agile & DevOps methodologies (Scrum, SAFe frameworks)
• Testing – unit, functional & performance experience
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.
As a Software Engineer III at JPMorgan Chase within the Asset & Wealth Management, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.
Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
- Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years applied experience
- Expert-level programming skills in Python. Experience designing and building APIs using popular frameworks such as Flask or FastAPI (a minimal sketch follows this list)
- Familiar with site reliability concepts, principles, and practices
- Experience maintaining cloud-based infrastructure
- Familiar with observability practices such as white-box and black-box monitoring, service level objective alerting, and telemetry collection using tools such as Grafana, Dynatrace, Prometheus, Datadog, Splunk, and others
- Emerging knowledge of software, applications and technical processes within a given technical discipline (e.g., Cloud, artificial intelligence, Android, etc.)
- Emerging knowledge of continuous integration and continuous delivery tools (e.g., Jenkins, Jules, Spinnaker, BitBucket, GitLab, Terraform, etc.)
- Emerging knowledge of common networking technologies
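As a minimal, hypothetical sketch of the Flask-style API work mentioned above (the endpoints and payloads are invented for illustration and are not JPMC's actual APIs):
```python
# Minimal Flask sketch; route names and response fields are placeholders.
from flask import Flask, jsonify

app = Flask(__name__)

# Liveness endpoint of the kind SRE-style monitoring typically probes.
@app.route("/health")
def health():
    return jsonify(status="ok")

# Illustrative resource endpoint with a path parameter.
@app.route("/accounts/<account_id>")
def get_account(account_id):
    return jsonify(account_id=account_id, currency="USD")

if __name__ == "__main__":
    app.run(port=8080)
```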
Preferred qualifications, capabilities, and skills
- General knowledge of financial services industry
- Experience working in a public cloud environment using wrappers and practices in use at JPMC
- Knowledge of Terraform, containers, and container orchestration (especially Kubernetes) preferred
Need Immediate Joiners only (4-5 days)
Overall experience of 6+ years, with at least 2+ years of relevant experience in Golang alongside Java
● Strong design and architectural experience in building various highly scalable and highly available products
● Strong understanding of SDLC activities, which include Analysis, Design, Development, Testing, Deployment, Post-Production Support, etc.
● Proficiency in at least one server-side framework, preferably for Golang
● Experience working on NoSQL & SQL databases such as MySQL, PostgreSQL, MongoDB, Redis, etc.
● Deep-dive problem-solving, RCA and systematic thinking to reach the root cause of issues
● Able to work independently and multi-task effectively
● Able to program at a system level and manage service stability
● Excellent experience maintaining scalable, extensible code
● Methodical in maintaining up-to-date documentation
● Metric-driven mindset and obsessive about ensuring clean coding practices
● Preferred experience in product development
● Preferred working experience on microservices platforms
● Proficiency in at least one modern web front-end development framework such as ReactJS will be a bonus
Good to Have Skills:
● Preferred experience in Elasticsearch and Kibana (ELK Stack)
● Preferred experience with messaging systems like RabbitMQ
GCP Cloud Engineer:
- Proficiency in infrastructure as code (Terraform).
- Scripting and automation skills (e.g., Python, Shell); knowledge of Python is a must.
- Collaborate with teams across the company (i.e., network, security, operations) to build complete cloud offerings.
- Design Disaster Recovery and backup strategies to meet application objectives.
- Working knowledge of Google Cloud
- Working knowledge of various tools, open-source technologies, and cloud services
- Experience working on Linux-based infrastructure.
- Excellent problem-solving and troubleshooting skills
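As a hedged illustration of the Python automation this role emphasises (the specific check is an assumption, not a requirement from the JD), a short google-cloud-storage inventory script might look like this:
```python
# Minimal sketch, assuming application-default credentials and the
# google-cloud-storage client library are available.
from google.cloud import storage

client = storage.Client()

# Inventory buckets with their locations and storage classes - the sort of
# input a backup/DR review would start from.
for bucket in client.list_buckets():
    print(f"{bucket.name}: location={bucket.location}, class={bucket.storage_class}")
```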
Publicis Sapient Overview:
As a Senior Associate in Data Engineering, you will translate client requirements into technical design and implement components for the data engineering solution. You will utilize a deep understanding of data integration and big data design principles in creating custom solutions or implementing package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.
Job Summary:
As Senior Associate L2 in Data Engineering, you will translate client requirements into technical design and implement components for the data engineering solution. You will utilize a deep understanding of data integration and big data design principles in creating custom solutions or implementing package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.
The role requires a hands-on technologist with a strong programming background in Java / Scala / Python, experience in data ingestion, integration and data wrangling, computation and analytics pipelines, and exposure to Hadoop ecosystem components. You are also required to have hands-on knowledge of at least one of the AWS, GCP or Azure cloud platforms.
Role & Responsibilities:
Your role is focused on Design, Development and delivery of solutions involving:
• Data Integration, Processing & Governance
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Implement scalable architectural models for data processing and storage
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode
• Build functionality for data analytics, search and aggregation
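To make the batch & real-time ingestion responsibility above concrete, here is a minimal Spark Structured Streaming sketch that reads from Kafka and lands data in a lake; the broker address, topic, and paths are placeholders.
```python
# Minimal sketch, assuming a Kafka broker at "broker:9092", a topic named "events",
# and writable S3 locations for output and checkpoints.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Read events from Kafka as a streaming DataFrame.
stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events")
         .load()
)

# Kafka delivers key/value as binary; cast the value to string for downstream parsing.
events = stream.select(F.col("value").cast("string").alias("payload"))

# Land the stream in the data lake; the checkpoint enables recovery on restart.
query = (
    events.writeStream.format("parquet")
          .option("path", "s3://example-lake/events/")
          .option("checkpointLocation", "s3://example-lake/checkpoints/events/")
          .start()
)

query.awaitTermination()
```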
Experience Guidelines:
Mandatory Experience and Competencies:
# Competency
1.Overall 5+ years of IT experience with 3+ years in Data related technologies
2.Minimum 2.5 years of experience in Big Data technologies and working exposure in at least one cloud platform on related data services (AWS / Azure / GCP)
3.Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required in building end-to-end data pipelines.
4.Strong experience in at least one of the programming languages Java, Scala or Python; Java preferable.
5.Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
6.Well-versed, working knowledge of data platform related services on at least one cloud platform, including IAM and data security
Preferred Experience and Knowledge (Good to Have):
# Competency
1.Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience
2.Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
3.Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures
4.Performance tuning and optimization of data pipelines
5.CI/CD – Infra provisioning on cloud, auto build & deployment pipelines, code quality
6.Cloud data specialty and other related Big data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
We are Seeking:
1. AWS Serverless, AWS CDK:
Proficiency in developing serverless applications using AWS Lambda, API Gateway, S3, and other relevant AWS services.
Experience with AWS CDK for defining and deploying cloud infrastructure.
Knowledge of serverless design patterns and best practices.
Understanding of Infrastructure as Code (IaC) concepts.
Experience in CI/CD workflows with AWS CodePipeline and CodeBuild.
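As an illustrative sketch of the AWS CDK + serverless combination described above (the construct IDs, the ./lambda asset directory, and the runtime version are assumptions), a CDK v2 stack in Python putting a Lambda function behind API Gateway could look like this:
```python
# Minimal AWS CDK (v2, Python) sketch of a serverless API: Lambda behind API Gateway.
# Names ("HelloFn", "HelloApi") and the ./lambda asset directory are placeholders.
from aws_cdk import App, Stack, aws_lambda as _lambda, aws_apigateway as apigw
from constructs import Construct

class ServerlessApiStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Lambda function whose handler code lives in ./lambda/index.py.
        handler = _lambda.Function(
            self, "HelloFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="index.handler",
            code=_lambda.Code.from_asset("lambda"),
        )

        # REST API that proxies requests to the Lambda function.
        apigw.LambdaRestApi(self, "HelloApi", handler=handler)

app = App()
ServerlessApiStack(app, "ServerlessApiStack")
app.synth()
```
Running `cdk deploy` would synthesize and provision the stack; in a pipeline this step is typically wired into CodePipeline/CodeBuild as noted above.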
2. TypeScript, React/Angular:
Proficiency in TypeScript.
Experience in developing single-page applications (SPAs) using React.js or Angular.
Knowledge of state management libraries like Redux (for React) or RxJS (for Angular).
Understanding of component-based architecture and modern frontend development practices.
3. Node.js:
Strong proficiency in backend development using Node.js.
Understanding of asynchronous programming and event-driven architecture.
Familiarity with RESTful API development and integration.
4. MongoDB/NoSQL:
Experience with NoSQL databases and their use cases.
Familiarity with data modeling and indexing strategies in NoSQL databases.
Ability to integrate NoSQL databases into serverless architectures.
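For illustration of the data modelling and indexing points above (the database, collection, and field names are invented), a small pymongo example might be:
```python
# Minimal NoSQL sketch, assuming pymongo is installed and a MongoDB instance
# is reachable at the given URI; names and fields are illustrative only.
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")
orders = client["appdb"]["orders"]

# Document model: embed line items in the order rather than joining across tables.
orders.insert_one({
    "user_id": "u-123",
    "status": "PLACED",
    "items": [{"sku": "A1", "qty": 2}],
})

# Indexing strategy: compound index supporting the most common query pattern.
orders.create_index([("user_id", ASCENDING), ("status", ASCENDING)])

print(orders.find_one({"user_id": "u-123", "status": "PLACED"}))
```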
5. CI/CD:
Ability to troubleshoot and debug CI/CD pipelines.
Knowledge of automated testing practices and tools.
Understanding of deployment automation and release management processes.
Educational Background: Bachelor's degree in Computer Science, Engineering, or a related field.
Certification (Preferred / Added Advantage): AWS certifications (e.g., AWS Certified Developer - Associate)
Senior Java Backend Engineer
Experience: 6-8 Years
Location: Pune / Bangalore
Type: Full-time
Tech Stack: Java, AWS, Spring Boot, Postgres, SQL/NoSQL
About Digit88
Digit88 empowers digital transformation for innovative and high growth B2B and B2C SaaS companies as their trusted offshore software product engineering partner!
We are a lean mid-stage software company, with a team of 75+ fantastic technologists, backed by executives with deep understanding of and extensive experience in consumer and enterprise product development across large corporations and startups. We build highly efficient and effective engineering teams that solve real and complex problems for our partners.
With 50+ years of collective experience in areas ranging from B2B and B2C SaaS, web and mobile apps, e-commerce platforms and solutions, and custom enterprise SaaS platforms, across domains including Conversational AI, Chatbots, IoT, Health-tech, ESG/Energy Analytics and Data Engineering, the founding team thrives in a fast-paced and challenging environment that allows us to showcase our best.
The Vision: To be the most trusted technology partner to innovative software product companies world-wide
The Opportunity
Digit88 is expanding the extended software product engineering team for its partner, a US-based Energy Analytics SaaS platform company. Our partner is building a suite of cloud-based business operation support platforms in the Utilities Rate Lifecycle space in the Energy sector/domain. This is a bleeding edge AI and Big Data platform that helps large energy utility companies in the US plan, manage, review and optimize their new product and rate design, billing, rate analysis, forecasting, and CRM. The candidate would be joining an existing team of product engineers in the US, China and Pune/India and help us establish an extended product engineering team at Digit88.
Job Profile
Digit88 is looking for a Sr. Java Engineer with excellent hands-on experience in Java, AWS, and Spring Boot. You will be designing solutions around large enterprise and distributed systems. You will collaborate with a multi-disciplinary team of engineers and architects on a wide range of problems, bringing technical direction, architecture and design decisions, and solving technical problems, ultimately enabling the product team to build, develop and improve products that will revolutionize the energy industry.
To be successful in this role, you should possess
- Bachelor's degree in Computer Science or a related field with 6-8 years hands-on experience with Java based technologies.
- Expertise in Core Java, Data Structures, J2EE with proven expertise in Spring MVC, Spring Boot, Microservices architecture, and Web Services (REST) in distributed systems
- Practical experience with relational databases like MySQL/Postgres and/or NoSQL databases like Cassandra
- Practical experience with Caching frameworks Memcached/Redis, Message Queues
- Experience in building high performance, high availability REST APIs and REST clients
- Expertise with log file analysis using one or more of ELK, Splunk, Kibana
- Prior experience in building solutions in AWS, using managed services.
- Experience with Kafka is a definite plus
- Strong practical experience in applying design patterns, multithreading concepts to solve complex problems, strong problem solving skills
- Excellent communication (oral and written) and interpersonal skills and an ability to effectively communicate with both business and technical teams.
- Excellent written and verbal communication skills for presenting findings to technical and non-technical audiences.
- Possess strong organizational and time management skills, with attention to detail.
- Good understanding of CI/CD and container architecture (Docker/Jenkins) and build scripts (Maven/Ant)
- Experience with Kubernetes
Roles and responsibilities
- Develop Java solutions on AWS with Spring Boot and microservices.
- Work closely with the US and India engineering teams to help build the Java/Spring based backend and REST APIs.
- Technical excellence and ownership of critical modules; own the development of new modules and features
- Troubleshoot live production server issues
- Able to work as a part of a team, be able to contribute independently and drive the team to exceptional contributions with minimal team supervision
- Perform Unit Testing and Integration testing in a Continuous Deployment scenario
- Follow Agile methodology, JIRA for work planning, issue management/tracking
- Bring Technical direction, design considerations and decisions as part of solutioning
Good to have/Preferred Qualifications:
- Experience in the utility or energy industries.
- Experience working with a start-up.
Additional Project/Soft Skills:
- Should be able to work independently with India & US based team members.
- Strong verbal and written communication with ability to articulate problems and solutions over phone and emails.
- Strong sense of urgency, with a passion for accuracy and timeliness.
- Ability to work calmly in high pressure situations and manage multiple projects/tasks.
- Ability to work independently and possess superior skills in issue resolution.
Benefits/Culture @ Digit88:
- Comprehensive Insurance (Life, Health, Accident)
- Flexible Work Model
- Accelerated learning & non-linear growth
- Flat organization structure driven by ownership and accountability.
- Global Peers - Working with some of the best engineers/professionals globally from the likes of Apple, Amazon, IBM Research, Adobe and other innovative product companies
- Ability to make a global impact with your work, leading innovations in Conversational AI, Tele-Medicine, Healthcare and more.
You will work with a founding team of serial entrepreneurs with multiple successful exits to their credit. The learning will be immense just as will the challenges.
This is the right time to join us and partner in our growth!
Publicis Sapient Overview:
As a Senior Associate in Data Engineering, you will translate client requirements into technical design and implement components for the data engineering solution. You will utilize a deep understanding of data integration and big data design principles in creating custom solutions or implementing package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.
Job Summary:
As Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for the data engineering solution. You will utilize a deep understanding of data integration and big data design principles in creating custom solutions or implementing package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.
The role requires a hands-on technologist with a strong programming background in Java / Scala / Python, experience in data ingestion, integration and data wrangling, computation and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP or Azure cloud platforms is preferable.
Role & Responsibilities:
Job Title: Senior Associate L1 – Data Engineering
Your role is focused on Design, Development and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
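As a minimal sketch of how batch ingestion like the above is commonly orchestrated with Airflow (which appears in the competency list below), the following DAG is illustrative only; its id, schedule, and task bodies are placeholders.
```python
# Minimal orchestration sketch for Airflow 2.x; the DAG id, schedule, and
# task callables are illustrative placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull a batch from a source system (API, database, flat file).
    print("extracting batch")

def load():
    # Placeholder: write the transformed batch to the lake/warehouse.
    print("loading batch")

with DAG(
    dag_id="daily_batch_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```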
Experience Guidelines:
Mandatory Experience and Competencies:
# Competency
1.Overall 3.5+ years of IT experience with 1.5+ years in Data related technologies
2.Minimum 1.5 years of experience in Big Data technologies
3.Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required in building end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
4.Strong experience in at least one of the programming languages Java, Scala or Python; Java preferable.
5.Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
Preferred Experience and Knowledge (Good to Have):
# Competency
1.Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience
2.Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
3.Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures
4.Performance tuning and optimization of data pipelines
5.CI/CD – Infra provisioning on cloud, auto build & deployment pipelines, code quality
6.Working knowledge of data platform related services on at least one cloud platform, including IAM and data security
7.Cloud data specialty and other related Big data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
Do you have what it takes?
· Proficient in C, C++ development, OS concepts, Data Structures, Distributed computing, and Algorithms.
· Actively engaged in projects related to architecture and design (architecture, design patterns, reliability, and scaling).
· Excellent problem-solving capability with strong fundamentals in Computer Science.
· Highly proficient in data structures and algorithms.
· Ability to pick up new technologies rapidly to convert specifications to low level design.
· Ability to clearly communicate the intent and approach when producing a code design document.
How can you contribute to our success?
· Develop frontend as well as backend components for delivering product on cloud platforms.
· Participate in activities that lead up to the code design. This may include activities such as developing prototypes and proofs of concept, comparing the pros and cons of various implementation choices being considered, and recommending the most appropriate one.
· Mentor and influence junior developers to adhere to good coding standards and become quality conscious.
· Review critical pieces of code that are developed as per this code design.
· Ensure that the code being delivered is of the highest quality.
· Ensure appropriate unit tests and functional as well as non-functional tests are considered.
· Conceptualize and work towards building processes, methodology, and tools to improve team’s ability to deliver high quality software.
Our organization relies on its central engineering workforce to develop and maintain a product portfolio of several different startups. As part of our engineering, you'll work on several products every quarter. Our product portfolio continuously grows as we incubate more startups, which means that different products are very likely to make use of different technologies, architecture & frameworks - a fun place for smart tech lovers!
Responsibilities:
- 3 - 5 years of experience working in DevOps/DevSecOps.
- Strong hands-on experience in deployment, administration, monitoring, and hosting services of Kubernetes.
- Ensure compliance with security policies and best practices for RBAC (Kubernetes) and AWS services like EKS, VPC, IAM, EC2, Route53, and S3.
- Implement and maintain CI/CD pipelines for continuous integration and deployment.
- Monitor and troubleshoot issues related to infrastructure and application security.
- Develop and maintain automation scripts using Terraform for infrastructure as code.
- Hands-on experience in configuring and managing a service mesh like Istio.
- Experience working in Cloud, Agile, CI/CD, and DevOps environments. We live in the Cloud.
- Experience with Jenkins, Google Cloud Build, or similar.
- Good to have experience using PaaS and SaaS services from cloud providers, such as BigQuery, Cloud Storage, S3, etc.
- Good to have experience with configuring, scaling, and monitoring database systems like PostgreSQL, MySQL, MongoDB, and so on.
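As a hedged example of the security/compliance automation described above (the cluster-admin review is an assumed scenario, not a stated requirement), a short script using the official Kubernetes Python client could be:
```python
# Minimal compliance-check sketch, assuming the "kubernetes" Python client is
# installed and a kubeconfig for the target cluster is available locally.
from kubernetes import client, config

config.load_kube_config()
rbac = client.RbacAuthorizationV1Api()

# Flag ClusterRoleBindings that grant cluster-admin, a common RBAC review item.
for binding in rbac.list_cluster_role_binding().items:
    if binding.role_ref.name == "cluster-admin":
        subjects = [s.name for s in (binding.subjects or [])]
        print(f"{binding.metadata.name} -> cluster-admin for {subjects}")
```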
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3 to 7 years of experience in DevSecOps or a related field.
- Great hands-on experience in Kubernetes, EKS, AWS services like VPC, IAM, EC2, and S3, Terraform (including Terraform backends and ternary/conditional expressions), and CI/CD tools like Jenkins.
- Strong understanding of DevOps principles and practices.
- Experience in developing and maintaining automation scripts using Terraform.
- Proficiency in any Linux scripting language.
- Familiarity with security tools and technologies such as WAF, IDS/IPS, and SIEM.
- Excellent problem-solving skills and ability to work independently and within a team.
- Good communication skills and ability to work collaboratively with cross-functional teams.
- Availability to join immediately or within 15 days, at the Pune, Bangalore, or Noida locations.
About CAW Studios:
CAW Studios is a Product Engineering Studio. WE BUILD TRUE PRODUCT TEAMS for our clients. Each team is a small, well-balanced group of geeks and a product manager who together produce relevant and high-quality products. We believe the product development process needs to be fixed as most development companies operate as IT Services. Unlike IT Apps, product development requires Ownership, Creativity, Agility, and design that scales.
Know More About CAW Studios:
Website: https://www.cawstudios.com/
Benefits: Startup culture, powerful Laptop, free snacks, cool office, flexible work hours and work-from-home policy, medical insurance, and most importantly - the opportunity to work on challenging cutting-edge problems.
at CodeCraft Technologies Private Limited
Position: Senior Backend Developer (NodeJS)
Experience: 5+ Years
Location: Bengaluru
CodeCraft Technologies is a multi-award-winning creative engineering company offering design and technology solutions on mobile, web and cloud platforms.
We are looking for an enthusiastic and self-driven Backend Engineer to join our team.
Roles and Responsibilities:
● Develop high-quality software design and architecture
● Identify, prioritize and execute tasks in the software development life cycle.
● Develop tools and applications by producing clean, efficient code
● Automate tasks through appropriate tools and scripting
● Review and debug code
● Perform validation and verification testing
● Collaborate with cross-functional teams to fix and improve products
● Document development phases and monitor systems
● Ensure software is up-to-date with the latest technologies
Desired Profile:
● NodeJS [Typescript]
● MongoDB [NoSQL DB]
● MySQL, PostgreSQL
● AWS - S3, Lambda, API Gateway, CloudWatch, ECR, ECS, Fargate, SQS / SNS
● Terraform, Kubernetes, Docker
● Good Understanding of Serverless Architecture
● Proven experience as a Senior Software Engineer
● Extensive experience in software development, scripting and project management
● Experience using system monitoring tools (e.g. New Relic) and automated testing frameworks
● Familiarity with various operating systems (Linux, Mac OS, Windows)
● Analytical mind with problem-solving aptitude
● Ability to work independently
Good to Have:
● Actively contribute to relevant open-source projects, demonstrating a commitment to community collaboration and continuous learning.
● Share knowledge and insights gained from open-source contributions with the development team
● AWS Solutions Architect Professional Certification
● AWS DevOps Professional Certification
● Multi-Cloud/ hybrid cloud experience
● Experience in building CI/CD pipelines using AWS services
Looking for a Java Developer | Bangalore to join a team of rockstar developers. The candidate should have a minimum of 4+ years of experience. There are multiple openings. If you're looking for career growth & a chance to work with the top 0.1% of developers in the industry, this one is for you! You will report to IITians/BITS grads with 10+ years of development experience and work with F500 companies (our customers).
Company Background - CodeVyasa is a Software Product-Engineering and Development company that helps Early-stage & Mid-Market Product companies with IT Consulting, App Development, and On-demand Tech Resources. Our Journey over the last 3 years has been nothing short of a roller-coaster. Along our way, we've won some of the most prestigious awards while driving immense value to our customers & employees. Here's the link to our website (codevyasa.com). To give you a sense of our growth rate, we've added 70+ employees in the last 6 weeks itself and expect another 125+ by the end of Q1 2024.
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or related field (or equivalent experience).
- Minimum of 4 years of experience as a Java Developer.
- Proficiency in AWS development.
- Aptitude for learning new technologies quickly.
- Good problem-solving and analytical skills.
What We Offer:
- Glassdoor rating of 4.8, indicating high employee satisfaction.
- Free healthcare benefits.
- Strong focus on upskilling and professional development opportunities.
- Diverse and inclusive work environment.
- Competitive compensation and benefits package.
- Emphasis on maintaining a healthy work-life balance
Job Description-
Responsibilities:
- Create blueprints for application deployment, build Terraform modules for infrastructure provisioning, and enhance security with automation tools like Harness and Flux.
- Work with cutting-edge CI/CD tooling and orchestrate deployment using AWS, Azure, Docker Swarm, and Kubernetes.
- Architect and deploy CI/CD platforms to facilitate rapid development and deployment cycles.
- Design and implement Kubernetes environments, API Gateways, and self-service portals.
- Evaluate and integrate third-party tools and services for improved CI/CD processes.
- Educate and lead a team of DevOps engineers in best practices and operational excellence.
- Employ SonarCloud for continuous code quality checks and JMeter for application performance testing.
Qualifications:
- 7+ years of experience with architecting and deploying applications on AWS or Microsoft Azure.
- Proficiency with CI/CD tooling such as GitHub, GitLab, Jenkins, or Harness.
- Hands-on experience with container orchestration.
Job Description: Full Stack Developer
Company: Arroz Technology Private Limited
CTC: 5 LPA
Location : Bangalore (Onsite)
Responsibilities:
- Design and develop scalable and high-performance web applications using the MERN (MongoDB, Express.js, React.js, Node.js) stack.
- Collaborate with cross-functional teams to gather requirements and translate them into high-level designs.
- Write clean, reusable, and well-structured code following industry best practices and coding standards.
- Mentor and guide junior developers, providing technical expertise and promoting professional growth.
- Conduct code reviews and provide constructive feedback to ensure code quality and adherence to standards.
- Collaborate with frontend and backend developers to integrate components and ensure smooth data flow.
- Work with UI/UX designers to implement responsive and user-friendly interfaces.
- Stay updated with the latest trends and advancements in full-stack development technologies.
- Work in a 10 AM to 6 PM, six-day office role, maintaining regular attendance and punctuality.
Required Skills and Qualifications:
- Strong proficiency in MERN (MongoDB, Express.js, React.js, Node.js) stack development.
- Experience with Redux or similar state management libraries.
- Solid understanding of front-end technologies such as HTML, CSS, and JavaScript.
- Proficiency in RESTful API development and integration.
- Familiarity with version control systems like Git and agile development methodologies.
- Good problem-solving and debugging skills.
- Excellent communication and teamwork abilities.
- Bachelor's degree in Computer Science or a related field (preferred).
Join Arroz Technology Private Limited as a Full Stack Developer and contribute to the development of cutting-edge web applications. This role offers competitive compensation and growth opportunities within a dynamic work environment.
Design a multi-tier data pipeline to feed data into applications for building a full-featured analytics environment. Develop high-quality code to support the platform's technical architecture and design. Participate in and contribute to an effective software development lifecycle using Scrum and Agile. Collaborate with global teams and work as one team.
What you get to do:
You'll work on the design, implementation, and maintenance of data pipelines. Design and build database schemas to handle large-scale data migration & transformation. Be capable of designing a high-performance, scalable, distributed product in the cloud (AWS, GCS). Review development frameworks and coding standards, conduct code reviews and walkthroughs, and conduct in-depth design reviews. Identify gaps in the existing infrastructure and advocate for the necessary changes to close them.
Who we are looking for:
4 to 7 years of industry experience working in Spark and Scala/Python. Working experience with big-data tech stacks like Spark, Kafka & Athena. Extensive experience in SQL query optimization/tuning and debugging SQL performance issues. Experience in ETL/ELT processes to move data through the data processing pipeline. Be a fearless leader in championing smart design.
Top 3 primary skills and expertise level requirements (1 to 5; 5 being expert):
Excellent programming experience in Scala or Python. Good experience in SQL queries and optimizations. 2 to 3 years of Spark experience. Nice to have experience in Airflow. Nice to have experience with AWS EMR, Lambda, and S3.
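As an illustrative sketch tying the Spark and SQL-tuning expectations together (table names, paths, and columns are invented), a PySpark job using a broadcast join to avoid an expensive shuffle might look like this:
```python
# Minimal Spark tuning sketch; paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("clickstream-agg").getOrCreate()

clicks = spark.read.parquet("s3://example-lake/clickstream/")   # large fact data
campaigns = spark.read.parquet("s3://example-lake/campaigns/")  # small dimension table

# Broadcasting the small dimension table avoids a shuffle-heavy sort-merge join.
daily = (
    clicks.join(F.broadcast(campaigns), "campaign_id")
          .groupBy("campaign_name", F.to_date("event_ts").alias("day"))
          .agg(F.count("*").alias("clicks"))
)

daily.write.mode("overwrite").partitionBy("day").parquet("s3://example-lake/marts/daily_clicks/")
```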
Employment Type - FULLTIME
Industry Type - Media / Entertainment / Internet
Seniority Level - Mid-Senior-Level
Work Experience(in years) - 4 - 7 Years
Education - B.Tech/B.E.
Skills - Python, Scala, MS SQL Server, AWS
Job Requirements
- 3+ years of experience with custom application development.
- Must be dedicated, passionate, and hard-working. Attitude is everything.
- Must be able to work with a team and collaborate. Hard workers and self-starters please apply.
- We are looking for a creative and efficient problem solver.
Technical Requirements
- Experience working within Agile development environments.
- Experience with Node.js, PHP, Laravel, React.js, JavaScript, HTML5, CSS3.
- Experience with Redux, Redux Saga
- OAuth and JWT Tokens experience is a plus.
- Experience with the Next.js and Nest.js framework is a plus.
- Advanced Laravel Experience (Middleware, Collections, Policies, and Service Containers).
- Experience with unit testing (PHP Unit).
- Experience with MySQL profiling and query optimization.
- Solid working experience building RESTful APIs.
- Active experience integrating custom code with 3rd party web services.
- Hands-on experience with tools such as Git and Jira.
- Experience working in the AWS (Amazon Web Services) ecosystem.
Job Duties
- Work with a team of developers, BA, PM, QA, etc. to execute strategies and implement solutions to build quality business software applications.
- Develop front-end and back-end components for large data-driven applications.
- Review features requests, provide feedback, and develop/maintain features for web applications.
- Document your development process and development components.
- Work with other developers to complete tasks and share ideas.
Job Title: Senior Full Stack Engineer
Location: Bangalore
About threedots:
At threedots, we are committed to helping our customers navigate the complex world of secured credit financing. Our mission is to facilitate financial empowerment through innovative, secured credit solutions like Loans Against Property, Securities, FD & more. Founded by early members of Groww, we are a well-funded startup with over $4M in funding from India’s top investors.
Role Overview:
The Senior Full Stack Engineer will be responsible for developing and managing our web infrastructure and leading a team of talented engineers. With a solid background in both front and back-end technologies, and a proven track record of developing scalable web applications, the ideal candidate will have a hands-on approach and a leader's mindset.
Key Responsibilities:
- Lead the design, development, and deployment of our Node and ReactJS-based applications.
- Architect scalable and maintainable web applications that can handle the needs of a rapidly growing user base.
- Ensure the technical feasibility and smooth integration of UI/UX designs.
- Optimize applications for maximum speed and scalability.
- Implement comprehensive security and data protection.
- Manage and review code contributed by the team and maintain high standards of software quality.
- Deploy applications on AWS/GCP and manage server infrastructure.
- Work collaboratively with cross-functional teams to define, design, and ship new features.
- Provide technical leadership and mentorship to other team members.
- Keep abreast of the latest technological advancements to leverage new tech and tools.
Minimum Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Minimum 3 years of experience as a full-stack developer.
- Proficient in Node.js and ReactJS.
- Experience with cloud services (AWS/GCP).
- Solid understanding of web technologies, including HTML5, CSS3, JavaScript, and responsive design.
- Experience with databases, web servers, and UI/UX design.
- Strong problem-solving skills and the ability to make sound architectural decisions.
- Proven ability to lead and mentor a tech team.
Preferred Qualifications:
- Experience in fintech
- Strong knowledge of software development methodologies and best practices.
- Experience with CI/CD pipelines and automated testing.
- Familiarity with microservices architecture.
- Excellent communication and leadership skills.
What We Offer:
- The opportunity to be part of a founding team and shape the company's future.
- Competitive salary with equity options.
- A creative and collaborative work environment.
- Professional growth opportunities as the company expands.
- Additional Startup Perks