11+ AWS Simple Queue Service (SQS) Jobs in Pune

A consulting & implementation services firm serving the Oil & Gas, Mining, and Manufacturing industries
- Data Engineer
Required skill set: AWS Glue, AWS Lambda, AWS SNS/SQS, AWS Athena, Spark, Snowflake, Python
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting languages: Python & PySpark
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Data ingestion from different data sources that expose data through different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems; implement data ingestion and processing with the help of Big Data technologies
- Data processing/transformation using technologies such as Spark and cloud services; you will need to understand your part of the business logic and implement it using the language supported by the base data platform
- Develop automated data quality checks to ensure the right data enters the platform, and verify the results of calculations
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports and presenting findings
- Mentor junior members and bring best industry practices
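The automated data quality checks described above can be sketched in miniature. This is a minimal pure-Python illustration of the idea; in this role such checks would typically run as a PySpark or AWS Glue job, and the record schema and rules below are hypothetical examples, not part of the posting.

```python
# Minimal sketch of automated data quality checks on ingested records.
# In production this would typically run inside a PySpark / AWS Glue job;
# the fields and rules here are hypothetical examples.

def check_quality(records, required_fields, non_negative_fields):
    """Return (valid_records, errors) after applying basic quality rules."""
    valid, errors = [], []
    for i, rec in enumerate(records):
        missing = [f for f in required_fields if rec.get(f) in (None, "")]
        negative = [f for f in non_negative_fields
                    if isinstance(rec.get(f), (int, float)) and rec[f] < 0]
        if missing or negative:
            errors.append({"row": i, "missing": missing, "negative": negative})
        else:
            valid.append(rec)
    return valid, errors

rows = [
    {"id": 1, "amount": 120.5},
    {"id": 2, "amount": -3.0},    # fails the non-negative rule
    {"id": None, "amount": 7.0},  # fails the required-field rule
]
valid, errors = check_quality(rows,
                              required_fields=["id"],
                              non_negative_fields=["amount"])
# valid keeps row 0; errors records rows 1 and 2 with the failed rules
```

Quarantining failed rows with a reason, rather than silently dropping them, is the file-based-ingestion best practice the requirements list refers to.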
QUALIFICATIONS
- 5-7+ years’ experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of one of the following languages: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with:
- Data mining/programming tools (e.g. SAS, SQL, R, Python)
- Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
- Data visualization tools (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience in the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
What You’ll Do:
As a Sr. Data Scientist, you will work closely with DeepIntent Data Science teams located in New York, India, and Bosnia. The role focuses on building predictive models and implementing data-driven solutions to maximize ad effectiveness. You will also lead efforts to generate analyses and insights related to the measurement of campaign outcomes, Rx, and the patient journey, and support the evolution of the DeepIntent product suite. Activities in this position include developing and deploying models in production; reading campaign results; analyzing medical claims, clinical, demographic, and clickstream data; performing analysis and creating actionable insights; and summarizing and presenting results and recommended actions to internal stakeholders and external clients as needed.
- Explore ways to create better predictive models.
- Analyze medical claims, clinical, demographic and clickstream data to produce and present actionable insights.
- Explore ways of using inference, statistical, and machine learning techniques to improve the performance of existing algorithms and decision heuristics.
- Design and deploy new iterations of production-level code.
- Contribute posts to our upcoming technical blog.
Who You Are:
- Bachelor’s degree in a STEM field, such as Statistics, Mathematics, Engineering, Biostatistics, Econometrics, Economics, Finance, or Data Science.
- 5+ years of working experience as a Data Scientist or Researcher in digital marketing, consumer advertisement, telecom, or other areas requiring customer-level predictive analytics.
- Advanced proficiency in performing statistical analysis in Python, including relevant libraries, is required.
- Experience working with data processing, transformation and building model pipelines using tools such as Spark, Airflow, and Docker.
- You have an understanding of the ad-tech ecosystem, digital marketing and advertising data and campaigns or familiarity with the US healthcare patient and provider systems (e.g. medical claims, medications).
- You have varied and hands-on predictive machine learning experience (deep learning, boosting algorithms, inference…).
- You are interested in translating complex quantitative results into meaningful findings and interpretable deliverables, and communicating with less technical audiences orally and in writing.
- You can write production-level code and work with Git repositories.
- Active Kaggle participant.
- Working experience with SQL.
- Familiar with medical and healthcare data (medical claims, Rx, preferred).
- Conversant with cloud technologies such as AWS or Google Cloud.
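The SQL and claims-analysis skills listed above can be illustrated with a small self-contained sketch. This uses Python's built-in sqlite3 module and a hypothetical medical-claims table; the schema and data are illustrative only, not from the posting.

```python
# Sketch of the kind of SQL analysis this role calls for, using Python's
# built-in sqlite3 module. The claims table and rows are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (patient_id TEXT, drug TEXT, cost REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("p1", "drugA", 100.0), ("p1", "drugB", 50.0), ("p2", "drugA", 75.0)],
)

# Total spend per patient, highest first.
rows = conn.execute(
    "SELECT patient_id, SUM(cost) AS total "
    "FROM claims GROUP BY patient_id ORDER BY total DESC"
).fetchall()
# rows -> [('p1', 150.0), ('p2', 75.0)]
```

The same GROUP BY / aggregate pattern scales up to real claims warehouses (Redshift, Snowflake) with only the connection layer changing.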
Key Responsibilities :
- Design, develop, and maintain applications using Java and Kotlin
- Write clean, scalable, and efficient code
- Build and consume RESTful APIs and microservices
- Participate in all phases of the software development lifecycle
- Work with databases like MySQL, PostgreSQL, or MongoDB
- Collaborate with cross-functional teams including Product, QA, and DevOps
- Conduct unit testing and assist in code reviews
- Troubleshoot and debug applications
- Ensure application performance, security, and scalability
Required Skills :
- Strong programming experience in Java (Core/Advanced)
- Experience in Kotlin
- Solid understanding of OOP concepts, design patterns, and data structures
- Experience with frameworks such as Spring Boot, Ktor, or Android SDK
- Proficient in building and consuming RESTful APIs
- Familiarity with Git, JIRA, and CI/CD tools
- Basic knowledge of unit testing frameworks like JUnit or Mockito
Job Summary
We are looking for a skilled Java + Cloud Developer to design, develop, and maintain high-performance applications. The ideal candidate will have strong expertise in Core Java, Spring Framework, multithreading, and database management, along with exposure to cloud platforms and containerization technologies.
Job Title: Java + Cloud Developer
Location: Pune / Mumbai / Bangalore
Experience Level: 4-8 years
Employment Type: Full-time
Key Responsibilities
- Design, develop, and maintain scalable Java applications using Core Java, Spring Framework, JDBC, and multithreading concepts.
- Implement and integrate database solutions using relational and NoSQL databases.
- Utilize JDBC for database connectivity and manipulation.
- Work with cloud platforms such as Azure or GCP; experience with DevOps practices is an added advantage.
- Develop, deploy, and manage applications using containerization technologies (Docker, Kubernetes).
- Debug and troubleshoot applications through log analysis and monitoring tools.
- Collaborate with cross-functional teams to ensure seamless integration between multi-service components.
- Handle large-scale data processing tasks effectively; hands-on experience with Apache Spark is a plus.
- Apply Agile methodologies (Scrum/Kanban) in daily development activities.
- Continuously research and adopt new technologies to improve development processes and methodologies.
Required Skills & Qualifications
- Strong proficiency in Core Java (Java 8 or higher) with a deep understanding of threading and concurrent programming.
- Solid experience with the Spring Framework and its various modules (Spring Boot, Spring MVC, Spring Data, etc.).
- Experience with RDBMS (e.g., MySQL, PostgreSQL, Oracle) and NoSQL databases (e.g., MongoDB, Cassandra).
- Basic understanding of cloud platforms (Azure, GCP, or AWS).
- Knowledge of DevOps practices (CI/CD, version control, monitoring tools) is a plus.
- Familiarity with Docker and Kubernetes for application deployment and scaling.
- Strong analytical and problem-solving skills.
- Good communication skills and ability to work in a collaborative environment.
Preferred Qualifications
- Hands-on experience with Apache Spark for big data processing.
- Exposure to microservices architecture and API integrations.
- Familiarity with log monitoring tools (ELK, Splunk, etc.).
Note: Candidates should be serving notice or have a notice period of 30 days or less.
Your ability to grasp and analyze complex information is second to none, and you have a natural desire to help people understand things that are hard to understand.
Objectives of this Role
• Develop comprehensive documentation that meets organizational standards
• Obtain a deep understanding of products and services to translate complex product information into simple, polished, and engaging content
• Write user-friendly content that meets the needs of the target audience, turning insights into language that sets our users up for success
• Develop and maintain detailed databases of appropriate reference materials, including research, usability tests, and design specifications
• Evaluate current content and develop innovative approaches for improvement
Daily and Monthly Responsibilities
• Research, outline, write, and edit new and existing content, working closely with various departments to understand project requirements
• Independently gather information from subject matter experts to develop, organize, and write procedure manuals, technical specifications, and process documentation
• Work with development and support leads to identify all documentation repositories, revise and edit them, and determine the best solution for data compilation and centralized storage
• Research, create, and maintain information architecture templates that uphold organizational and legal standards and allow for easy data migration
• Develop content in alternative media forms for maximum usability, with a consistent and cohesive voice across all documentation
Skills and Qualifications
• Bachelor’s degree in a relevant technical field
• 2-4 years of industry experience as an influential technical writer
• Knowledge of the tools and applications required to create content effectively
• Proven ability to quickly learn and understand complex topics
• Previous experience writing documentation and procedural materials for multiple audiences
• Superior written and verbal communication skills, with a keen eye for detail
• Experience working with engineering to improve the user experience: design and UI input, refining content, and creating visuals and diagrams for technical support content
Preferred Qualifications
• Proven ability to handle multiple projects simultaneously, with an eye for prioritization
• Firm understanding of the systems development life cycle (SDLC)
• Previous software development experience
• Certification through the Society for Technical Communication
• Experience using XML tools to create documentation
1. Strong knowledge of front-end scripting such as EJS, JavaScript, and jQuery
2. Proficiency with fundamental front-end languages such as HTML and CSS
3. Familiarity with JavaScript frameworks such as AngularJS, React, and Ember
4. Proficiency with server-side languages such as Python, Ruby, Java, PHP, or .NET
5. Good understanding of database technologies such as MySQL, Oracle, and MongoDB
Review all job requirements and specifications required for deploying the solution into the production environment.
Perform unit tests as per the checklist of deployment steps with the help of test cases, and maintain documentation for the same.
Work with the Lead to resolve all issues within the required timeframe and flag any delays.
Collaborate with the development team to review new programs for implementation activities, manage communication (if required) with different functions to resolve issues, and assist implementation leads in managing production deployments.
Document all issues during the deployment phase, record all findings from logs and actual deployments, and share the analysis.
Review and maintain all technical and business documents. Conduct and monitor the software implementation lifecycle, and make appropriate customizations to software for clients as per the deployment/implementation guide.
Train new members on product deployment and known issues; identify issues in processes and provide solutions for the same.
Ensure project tasks are appropriately updated in JIRA / the ticketing tool as in-progress/done, and raise issues as needed.
Take the initiative to learn and understand the relevant technologies, i.e. Vertica SQL, the internal data integration tool (Athena), the Pulse Framework, and Tableau.
Be flexible to work during non-business hours in exceptional cases (for a few days) to meet client time zones.
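The checklist-driven deployment verification described above can be sketched as a small script. This is an illustrative sketch only; the individual check functions are hypothetical stand-ins for real steps (service health, row counts, report validation), not anything specified in the posting.

```python
# Minimal sketch of a checklist-driven post-deployment verification.
# Each named check is a hypothetical stand-in for a real deployment step.

def run_checklist(checks):
    """Run each named check; return a {name: 'PASS' | 'FAIL: reason'} report."""
    report = {}
    for name, fn in checks:
        try:
            fn()
            report[name] = "PASS"
        except AssertionError as exc:
            report[name] = f"FAIL: {exc}"
    return report

def check_service_up():
    # Stand-in for pinging a health endpoint after deployment.
    assert True, "service not reachable"

def check_row_counts():
    # Stand-in for comparing deployed vs. expected data volumes.
    deployed, expected = 1000, 1000
    assert deployed == expected, "row count mismatch"

report = run_checklist([("service_up", check_service_up),
                        ("row_counts", check_row_counts)])
```

Capturing each failure with its reason, instead of stopping at the first error, gives the lead the complete findings document the responsibilities above call for.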
Experience with the following tools and technologies is preferred:
ETL tools: Talend, Informatica, Ab Initio, or DataStage
BI tools: Tableau, Jaspersoft, Pentaho, or QlikView
Database: Experience in Oracle or SS
Methodology: Experience in SDLC and/or Agile methodology



