11+ AWS Simple Queue Service (SQS) Jobs in Chennai | AWS Simple Queue Service (SQS) Job openings in Chennai
Apply to 11+ AWS Simple Queue Service (SQS) Jobs in Chennai on CutShort.io. Explore the latest AWS Simple Queue Service (SQS) Job opportunities across top companies like Google, Amazon & Adobe.
Role Summary
As a Data Engineer, you will be an integral part of our Data Engineering team, supporting an event-driven serverless data engineering pipeline on AWS cloud. You will be responsible for assisting in the end-to-end analysis, development & maintenance of data pipelines and systems (DataOps), and will work closely with fellow data engineers & production support to ensure the availability and reliability of data for analytics and business intelligence purposes.
Requirements:
· Around 4 years of working experience in data warehousing / BI systems.
· Strong hands-on experience with Snowflake and strong programming skills in Python.
· Strong hands-on SQL skills.
· Knowledge of cloud databases such as Snowflake, Redshift, Google BigQuery, RDS, etc.
· Knowledge of dbt for cloud databases.
· Experience with AWS services such as SNS, SQS, ECS, Kinesis & Lambda functions, plus Docker.
· Solid understanding of ETL processes and data warehousing concepts.
· Familiarity with version control systems (e.g., Git/Bitbucket) and collaborative development practices in an agile framework.
· Experience with Scrum methodologies.
· Infrastructure build tools such as CloudFormation (CFT) / Terraform are a plus.
· Knowledge of Denodo, data cataloguing tools & data quality mechanisms is a plus.
· Strong team player with good communication skills.
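The event-driven pipeline described above can be sketched in a few lines. This is a minimal, illustrative simulation using only the standard library: `queue.Queue` stands in for an SQS queue and `lambda_handler` is a hypothetical Lambda-style function (in a real deployment you would use the boto3 SQS client and an actual Lambda entry point).

```python
import json
import queue

# Stand-in for an SQS queue; in production this would be the boto3 SQS client.
event_queue = queue.Queue()

def lambda_handler(event):
    """Lambda-style handler: parse an order event into a row ready for loading."""
    body = json.loads(event["Body"])
    return {"order_id": body["order_id"], "amount": body["amount"]}

# A producer (e.g. SNS fan-out) drops messages onto the queue...
event_queue.put({"Body": json.dumps({"order_id": 1, "amount": 250.0})})
event_queue.put({"Body": json.dumps({"order_id": 2, "amount": 99.5})})

# ...and the pipeline drains it, transforming each message (the DataOps step).
rows = []
while not event_queue.empty():
    rows.append(lambda_handler(event_queue.get()))

print(rows)
```

The transformed rows would then be loaded into the warehouse (e.g. Snowflake) by a downstream step.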
Overview: OptiSol Business Solutions
OptiSol was named on this year's Best Companies to Work For list by Great Place to Work. We are a team of 500+ Agile employees with a development center in India and global offices in the US, UK (United Kingdom), Australia, Ireland, Sweden, and Dubai. Over a 16+ year journey, we have built about 500+ digital solutions and have 200+ happy and satisfied clients across 24 countries.
Benefits of working with OptiSol
· Great Learning & Development program
· Flextime, Work-at-Home & Hybrid Options
· A knowledgeable, high-achieving, experienced & fun team.
· Spot Awards & Recognition.
· The chance to be a part of the next success story.
· A competitive base salary.
More Than Just a Job, We Offer an Opportunity to Grow. Are you someone who wants to build your future and your dream? We have the job for you, to make that dream come true.
Google Data Engineer - SSE
Position Description
Google Cloud Data Engineer
Notice Period: Immediate, or serving notice of up to 30 days
Job Description:
We are seeking a highly skilled Data Engineer with extensive experience in Google Cloud Platform (GCP) data services and big data technologies. The ideal candidate will be responsible for designing, implementing, and optimizing scalable data solutions while ensuring high performance, reliability, and security.
Key Responsibilities:
• Design, develop, and maintain scalable data pipelines and architectures using GCP data services.
• Implement and optimize solutions using BigQuery, Dataproc, Composer, Pub/Sub, Dataflow, GCS, and BigTable.
• Work with GCP databases such as Bigtable, Spanner, CloudSQL, AlloyDB, ensuring performance, security, and availability.
• Develop and manage data processing workflows using Apache Spark, Hadoop, Hive, Kafka, and other Big Data technologies.
• Ensure data governance and security using Dataplex, Data Catalog, and other GCP governance tooling.
• Collaborate with DevOps teams to build CI/CD pipelines for data workloads using Cloud Build, Artifact Registry, and Terraform.
• Optimize query performance and data storage across structured and unstructured datasets.
• Design and implement streaming data solutions using Pub/Sub, Kafka, or equivalent technologies.
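The streaming responsibilities above (Pub/Sub ingestion feeding a Dataflow-style aggregation) can be illustrated with a plain-Python tumbling-window count. This is a conceptual sketch only; `tumbling_window_counts` and the simulated event stream are hypothetical, and a real pipeline would use the Pub/Sub client and Apache Beam windowing.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Assign each (timestamp, key) event to a fixed-size window and count per key."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # bucket the timestamp into its window
        counts[(window_start, key)] += 1
    return dict(counts)

# Simulated stream: (epoch_seconds, event_key) pairs, as if consumed from Pub/Sub.
stream = [(0, "click"), (30, "click"), (61, "click"), (65, "view")]
print(tumbling_window_counts(stream))  # → {(0, 'click'): 2, (60, 'click'): 1, (60, 'view'): 1}
```

In Dataflow/Beam, the same grouping would be expressed with `FixedWindows` plus a `Count.perKey` transform.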
Required Skills & Qualifications:
• 8-15 years of experience
• Strong expertise in GCP Dataflow, Pub/Sub, Cloud Composer, Cloud Workflows, BigQuery, Cloud Run, and Cloud Build.
• Proficiency in Python and Java, with hands-on experience in data processing and ETL pipelines.
• In-depth knowledge of relational databases (SQL Server, MySQL, PostgreSQL, Oracle) and NoSQL databases (MongoDB, Scylla, Cassandra, DynamoDB).
• Experience with Big Data platforms such as Cloudera, Hortonworks, MapR, Azure HDInsight, IBM Open Platform.
• Strong understanding of AWS Data services such as Redshift, RDS, Athena, SQS/Kinesis.
• Familiarity with data formats such as Avro, ORC, Parquet.
• Experience handling large-scale data migrations and implementing data lake architectures.
• Expertise in data modeling, data warehousing, and distributed data processing frameworks.
• GCP Professional Data Engineer certification or equivalent.
Good to Have:
• Experience in BigQuery, Presto, or equivalent.
• Exposure to Hadoop, Spark, Oozie, HBase.
• Understanding of cloud database migration strategies.
• Knowledge of GCP data governance and security best practices.
Technical Architect (Databricks)
- 10+ Years Data Engineering Experience with expertise in Databricks
- 3+ years of consulting experience
- Completed Data Engineering Professional certification & required classes
- Minimum 2-3 projects delivered with hands-on experience in Databricks
- Completed Apache Spark Programming with Databricks, Data Engineering with Databricks, Optimizing Apache Spark™ on Databricks
- Experience in Spark and/or Hadoop, Flink, Presto, other popular big data engines
- Familiarity with Databricks multi-hop pipeline architecture
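The multi-hop (medallion) pipeline architecture mentioned above moves data through bronze (raw), silver (validated), and gold (aggregated) layers. Below is a minimal plain-Python sketch of that idea; the record shapes and helper names are hypothetical, and on Databricks each hop would be a Delta table transformation in Spark.

```python
# Bronze: raw ingested records, possibly malformed.
bronze = [
    {"user": "a", "amount": "10.5"},
    {"user": "b", "amount": "bad-value"},
    {"user": "a", "amount": "4.5"},
]

def to_silver(records):
    """Silver hop: validate and cast types; drop records that fail parsing."""
    silver = []
    for r in records:
        try:
            silver.append({"user": r["user"], "amount": float(r["amount"])})
        except ValueError:
            continue  # quarantine/drop malformed rows
    return silver

def to_gold(records):
    """Gold hop: business-level aggregate (total spend per user)."""
    totals = {}
    for r in records:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # → {'a': 15.0}
```

Each hop only ever reads from the previous layer, which is what makes the pipeline replayable from bronze when logic changes.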
Sr. Data Engineer (Databricks)
- 5+ Years Data Engineering Experience with expertise in Databricks
- Completed Data Engineering Associate certification & required classes
- Minimum 1 project delivered with hands-on experience in development on Databricks
- Completed Apache Spark Programming with Databricks, Data Engineering with Databricks, Optimizing Apache Spark™ on Databricks
- SQL delivery experience, and familiarity with BigQuery, Synapse or Redshift
- Proficient in Python, with knowledge of additional Databricks programming languages (Scala)
Key Responsibility Areas:
- Should have excellent knowledge of Swift and Objective-C
- Good working knowledge of Cocoa Touch
- Experience with performance and memory tuning with tools
- Experience with memory management & caching mechanisms specific to mobile devices
- Experience with third-party libraries and APIs
- Experience working with Core Data, Realm
- Understanding of the full mobile development life cycle
- Experience in publishing apps to the App Store.
- Code version tool – Git, Bitbucket
- Design Pattern - MVC and MVVM, MVP
- Must be able to provide individual or project oversight on rapid prototyping/POC efforts and large-scale enterprise-wide roll-out planning.
- Must be familiar with software development methodologies like Agile, Waterfall, Iterative etc.
- Must have strong analytical skills and should be able to define and build competency assets – estimators, tools, reusable assets, scripts, etc.
- Must have strong communication and customer interfacing skills with particular emphasis on Scope and Requirements Management
- Experience in IoT domain will be a big plus
Required Skills:
Experience: 3-8 years
About Company:
The company is a global leader in secure payments and trusted transactions. They are at the forefront of the digital revolution that is shaping new ways of paying, living, doing business and building relationships that pass on trust along the entire payments value chain, enabling sustainable economic growth. Their innovative solutions, rooted in a rock-solid technological base, are environmentally friendly, widely accessible and support social transformation.
Requirements:
1. Recent experience: has worked on VB.NET within the last year
2. VB.NET/ASP.NET with SQL Server experience
3. Web application development with hosting on IIS experience
4. Basics of jQuery
5. Good communication skills
6. Good attitude/team-working skills

Product-based company: Cargotec Corporation
Bachelor's degree in computer science or an equivalent degree.
Strong knowledge of and experience with object-oriented methodologies, enterprise application architectures and design patterns, and use of automated testing frameworks.
Handy with Visual Studio Code, IntelliJ, pgAdmin, Docker, and Postman.
Experience with REST APIs and WebSockets.
1+ years of experience in Angular and JavaScript; Vue and React are a plus.
At least 1.5 years of experience developing in Java.
Experience with relational database management systems (at least PostgreSQL and MS SQL).
Non-relational database engines are a plus, along with the ability to write well-documented, clean JavaScript code.

Reputed product-based MNC. Location: Chennai
Work with data from a SharePoint site as well as Excel files from local HRIS systems
Analyze data on a monthly, quarterly, and annual basis regarding recruitment and attrition
Prepare data reports using Excel and PowerPoint
Ensure data integrity by identifying issues and fixing errors either in SharePoint or the local HRIS system
Track recruitment costs on a regional and global basis
Work closely with the Finance team to monitor global recruitment budget costs and the new hire/attrition pipeline
Lead HR data projects, such as data review/audit and mass data changes
Assist the HR team with various data projects, as requested
Required Skills
Intermediate Microsoft Office skills, including Excel, PowerPoint and Word
Strong attention to detail
Excellent verbal and written communication skills
Ability to build strong working relationships across different locations and departments
Strong analytics and problem solving – able to identify anomalies in data and follow up to find an explanation as to why
Must maintain confidentiality
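The recruitment and attrition analysis above boils down to a couple of simple ratios. A small sketch, with hypothetical figures and helper names (in practice these would be computed in Excel from the SharePoint/HRIS extracts):

```python
def attrition_rate(leavers, avg_headcount):
    """Monthly attrition rate: leavers divided by average headcount, as a percentage."""
    return round(100.0 * leavers / avg_headcount, 2)

def cost_per_hire(total_recruitment_cost, hires):
    """Recruitment cost tracking: total spend divided by hires made."""
    return round(total_recruitment_cost / hires, 2)

# Hypothetical monthly figures for one region.
print(attrition_rate(leavers=6, avg_headcount=480))           # → 1.25
print(cost_per_hire(total_recruitment_cost=90000, hires=12))  # → 7500.0
```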
Ésah Tea is a brand devoted to giving you the ultimate tea experience. We believe that joy and warmth can be shared in many forms, but are best shared over a cup of tea. Our tea is made with absolute care and honesty by the best growers in Assam, who believe in giving only the best.
Our tea is the perfect harmony of aesthetics, flavor, complexity, and health. Our mission is to give you the best tea experience, a result of the respect we have for the art and craft of tea making and its culture. Sourced directly from the tea gardens of Assam, from the most authentic growers with finesse in the cultivation of tea, we aim to give you the best experience. We care about our environment and believe in sustainable and ethical extraction and production. Our initiative to switch to plastic-free teabags is a result of adopting sustainability as an ethic.
Ésah Tea recently raised a pre-Series A round from a well-known VC and has some milestones the company would like to achieve. As is well known in the eCommerce industry, achieving higher milestones month after month, year after year, is the standard. Hence, Ésah Tea is looking to build a core team that can take it to the next stage of its eCommerce journey. So if you think you are a perfect fit for a startup, let's have a chat.
The Job
- Manage paid acquisition channels like Google Ads and Facebook Ads. Analyze the results and optimize individual campaigns.
- Be responsible for the ROI of dozens of campaigns you execute.
- Own, drive, and report on crucial marketing efficiency metrics such as CAC, LTV, and ROAS
- Understand your target audience and customer behaviors. You create strategies and campaigns, understanding the entire user journey from account creation until purchase.
- Analyze creative data and make recommendations to the Creative designers
- Come up with a hypothesis and test it scientifically to consolidate learnings, thereby increasing the baseline of the team's quality.
- Document findings in a structured way to contribute to the company's global knowledge base
- Collaborate with international peers in other geographical markets to keep the highest possible baseline quality in your campaigns.
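The efficiency metrics named above (CAC, LTV, ROAS) are simple ratios. A minimal sketch with hypothetical campaign figures; the LTV formula here is one common simplification, not the only definition:

```python
def cac(total_spend, new_customers):
    """Customer acquisition cost: ad spend per customer acquired."""
    return total_spend / new_customers

def roas(revenue, ad_spend):
    """Return on ad spend: revenue generated per unit of spend."""
    return revenue / ad_spend

def ltv(avg_order_value, orders_per_year, retention_years):
    """A simple lifetime-value estimate: order value x order frequency x lifetime."""
    return avg_order_value * orders_per_year * retention_years

# Hypothetical campaign figures.
print(cac(50000, 400))      # → 125.0
print(roas(150000, 50000))  # → 3.0
print(ltv(600, 4, 2))       # → 4800
```

A campaign is typically healthy when LTV comfortably exceeds CAC; here 4800 vs 125.0.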
The Ideal Candidate
- 2-4 years of paid marketing experience, preferably in consumer-facing products
- Fluent in English (Advanced writing and speaking English skills)
- Experience with creative experimentation
- Experience with fast-paced, high growth startups
- Strong analytical and problem-solving skills
- A natural strategist
- Makes fast business decisions that prove to be profitable
Nice to have
- SQL
- Understanding of ad platforms algorithms
- Experience with programmatic media (DSP)
Note: Must be willing to relocate to Guwahati, Assam
Responsibilities
- Meeting with the development team to discuss user interface ideas and applications.
- Reviewing application requirements and interface designs.
- Identifying web-based user interactions.
- Developing and implementing highly responsive user interface components using React concepts.
- Writing application interface codes using JavaScript following React.js workflows.
- Troubleshooting interface software and debugging application codes.
- Developing and implementing front-end architecture to support user interface concepts.
- Monitoring and improving front-end performance.
- Documenting application changes and developing updates.
Requirements
- Bachelor’s degree in Computer Science, Information Technology, or a similar field.
- 2-7 years of work experience in React.js and Redux.
- In-depth knowledge of JavaScript, CSS, HTML, and front-end languages.
- Knowledge of React tools including React.js, Webpack, Enzyme, Redux, and Flux.
- Experience with user interface design.
- Experience with browser-based debugging and performance testing software.
- Excellent troubleshooting skills.
- Good project management skills.
Job Description:
○ Develop best practices for the team, and own architecture solutions and documentation operations in order to meet the engineering department's quality standards
○ Participate in production outages, handle complex issues, and work towards resolution
○ Develop custom tools and integrations with existing tools to increase engineering productivity
Required Experience and Expertise
○ Good knowledge of Terraform, including experience working on large TF code bases.
○ Deep understanding of Terraform best practices & writing TF modules.
○ Hands-on experience with GCP and AWS, and knowledge of AWS services such as VPC and VPC-related services (route tables, VPC endpoints, PrivateLink), EKS, S3, and IAM. A cost-aware mindset towards cloud services.
○ Deep understanding of Kernel, Networking and OS fundamentals
NOTICE PERIOD - Max 30 days




