

Position: AWS Data Engineer
Experience: 5 to 7 Years
Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram
Work Mode: Hybrid (3 days work from office per week)
Employment Type: Full-time
About the Role:
We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.
Key Responsibilities:
- Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL (see the sketch after this list).
- Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
- Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
- Optimize data models and storage for cost-efficiency and performance.
- Write advanced SQL queries to support complex data analysis and reporting requirements.
- Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
- Ensure high data quality and integrity across platforms and processes.
- Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.
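To make the Glue/PySpark responsibility concrete, here is a minimal, illustrative sketch of a Glue ETL job. It only runs inside the AWS Glue job environment, and the catalog database, table, column names, and job arguments are hypothetical, not part of this posting:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Job arguments supplied when the Glue job is started (names are illustrative)
args = getResolvedOptions(sys.argv, ["JOB_NAME", "source_database", "source_table", "target_path"])

sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw data registered in the Glue Data Catalog (assumed already crawled)
source_dyf = glue_context.create_dynamic_frame.from_catalog(
    database=args["source_database"],
    table_name=args["source_table"],
)

# Rename/cast columns; each mapping tuple is (source, source type, target, target type)
mapped_dyf = ApplyMapping.apply(
    frame=source_dyf,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("order_ts", "string", "order_ts", "timestamp"),
        ("amount", "string", "amount", "double"),
    ],
)

# Basic cleansing, then write partitioned Parquet to S3 for Athena/Redshift Spectrum
clean_df = (
    mapped_dyf.toDF()
    .dropna(subset=["order_id", "order_ts"])
    .withColumn("order_date", F.to_date("order_ts"))
)
clean_df.write.mode("overwrite").partitionBy("order_date").parquet(args["target_path"])

job.commit()
```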
Required Skills & Experience:
- Strong hands-on experience with Python or PySpark for data processing.
- Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Familiarity with serverless architectures and AWS best practices.
- Experience in designing and maintaining robust data architectures and data lakes.
- Ability to troubleshoot and resolve data pipeline issues efficiently.
- Strong communication and stakeholder management skills.

Similar jobs
Job Title: QA Engineer
Location: Gurgaon (Onsite)
Joining: Immediate joiners only
Certification: ISTQB certified
Experience: 2–4 years
Key Responsibilities
- Design, develop, and execute manual test cases for web and mobile applications.
- Perform API testing (REST/SOAP) using tools like Postman, Swagger, or similar.
- Write and execute complex SQL queries to validate data and ensure database integrity.
- Identify, report, and track defects; collaborate with developers for quick resolution.
- Review and analyze system specifications and requirements for creating effective test plans.
- Conduct regression, functional, integration, and system testing as part of release cycles.
- Document and maintain detailed test results, defect logs, and reports.
- Participate actively in Agile ceremonies (daily stand-ups, sprint planning, retrospectives).
Must-Have Skills
- Strong hands-on experience in Manual Testing for web and mobile platforms.
- Solid experience in API Testing (REST / SOAP).
- Hands-on experience writing complex SQL queries, with strong practical SQL skills for database validation.
- Good understanding of software development lifecycle (SDLC) and Agile methodologies.
- Strong analytical and troubleshooting skills.
Nice-to-Have
- Basic understanding of automation tools or frameworks (optional).
- Experience with JIRA or other bug tracking tools.
Other Details
- Employment Type: Full-time
- Work Mode: Onsite at Gurgaon
- Notice Period: Immediate joiners or candidates with a notice period of up to 7–10 days only

Immediate Joiners Preferred
NOTE: Working shift: 11 am–8 pm IST (+/- one hour on a need basis), Monday to Friday
Responsibilities:
1. Leverage Alteryx to build, maintain, execute, and deliver data assets. This entails using data analytics tools (a combination of proprietary tools, Excel, and scripting), validating raw data quality, and working with Tableau/Power BI and other geospatial data visualization applications
2. Analyze large data sets in Alteryx, finding patterns and providing a concise summary
3. Implement, maintain, and troubleshoot analytics solutions and derive key insights based on Tableau/Power BI visualizations and data analysis
4. Improve ETL setup and develop components so they can be rapidly deployed and configured
5. Conduct data assessments, perform data quality checks, and transform data using SQL and ETL tools (see the sketch after this list)
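A minimal sketch of the kind of SQL-based data quality check item 5 describes, run here against an in-memory SQLite table purely for illustration; the table, columns, and checks are invented, and a production setup would run similar queries against the warehouse itself:

```python
import sqlite3

# Toy orders table standing in for a real source extract
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('A1', '2024-01-05', 120.0),
        ('A1', '2024-01-05', 120.0),   -- duplicate row
        ('A2', NULL,         35.5),    -- missing date
        ('A3', '2024-01-07', -10.0);   -- negative amount
""")

# Typical quality checks: duplicate keys, nulls in key fields, out-of-range values
checks = {
    "duplicate_order_ids": """
        SELECT COUNT(*) FROM (
            SELECT order_id FROM orders GROUP BY order_id HAVING COUNT(*) > 1
        )""",
    "null_order_dates": "SELECT COUNT(*) FROM orders WHERE order_date IS NULL",
    "negative_amounts": "SELECT COUNT(*) FROM orders WHERE amount < 0",
}

for name, sql in checks.items():
    (count,) = conn.execute(sql).fetchone()
    print(f"{name}: {count} offending rows")
```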
Qualifications:
1. 3+ years of relevant and professional work experience with a reputed analytics firm
2. Bachelor's degree in Engineering / Information Technology from a reputed college
3. Must have the knowledge to handle, design, and optimize complex ETL workflows using Alteryx
4. Expertise with visualization tools such as Tableau/Power BI to solve business problems related to data exploration and visualization
5. Good knowledge of handling large amounts of data using SQL, T-SQL, or PL/SQL
6. Basic knowledge of Python for data processing is good to have
7. Very good understanding of data warehousing and data lake concepts


Job Category: Software Development
Job Type: Full Time
Job Location: Bangalore
Gnani.ai aims to empower enterprises with AI-based speech technology.
Gnani.ai is an AI-based Speech Recognition and NLP startup working on voice-based solutions for large businesses. AI is the biggest innovation disrupting the market, and we are at the heart of this disruption. Funded by one of the largest global conglomerates in the world and backed by a number of market leaders in the tech industry, we are working with some of the largest companies in the banking, insurance, e-commerce, and financial services sectors, and we are not slowing down. With aggressive expansion plans, Gnani.ai aims to be the leader in the global market for voice-based solutions.
Gnani.ai is building the future of voice-based business solutions. If you are fascinated by AI and would like to work on the latest AI technologies in a high-intensity, fast-growing, and flexible work environment with immense growth opportunities, come and join us. We are looking for hard workers who are ready to take on big challenges.
NLP Software Developer
Gnani.ai is looking to hire software developers with 0 to 2+ years of experience and a keen interest in designing and developing chat and voice bots. We are looking for an engineer who can work with us on developing an NLP framework if you have the skill set below.
Requirements :
- Proficient knowledge of Python
- Proficient understanding of code versioning tools, such as Git / SVN.
- Good knowledge of algorithms to find and implement tools for NLP tasks
- Knowledge of NLP libraries and frameworks
- Understanding of text representation techniques, algorithms, statistics
- Syntactic & Semantic Parsing
- Knowledge of or work experience with NoSQL databases such as MongoDB.
- Good knowledge of Docker container technologies.
- Strong communication skills
Responsibilities :
- Develop NLP systems according to requirements
- Maintain NLP libraries and frameworks
- Design and develop natural language processing systems
- Define appropriate datasets for language learning
- Use effective text representations to transform natural language into useful features (see the sketch after this list)
- Train the developed model and run evaluation experiments
- Find and implement the right algorithms and tools for NLP tasks
- Perform statistical analysis of results and refine models
- Constantly keep up to date with the field of machine learning
- Implement changes as needed and analyze bugs
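As a rough illustration of the text-representation bullet above, this minimal sketch uses scikit-learn's TF-IDF vectorizer plus a linear classifier as a simple intent-detection baseline; the utterances and intent labels are made up purely for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Toy utterances and intent labels, invented purely for illustration
texts = [
    "I want to check my account balance",
    "what is my current balance",
    "block my credit card immediately",
    "my card was stolen, please block it",
]
labels = ["balance_enquiry", "balance_enquiry", "block_card", "block_card"]

# TF-IDF turns each utterance into a sparse feature vector;
# a linear classifier on top is a common, simple baseline for intent detection
pipeline = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), lowercase=True)),
    ("clf", LogisticRegression(max_iter=1000)),
])
pipeline.fit(texts, labels)

print(pipeline.predict(["please block my card"]))  # expected: ['block_card']
```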
Good To Have :
Startup experience is a plus.

Jaro Education operates in the education space, catering to various B-Schools, universities, and premium institutes nationally and internationally, with deep corporate connections and over 2 lakh alumni.
Jaro Education is also successfully catering to the needs of working professionals by offering them varied choices in management and technology programs from reputed institutes, universities, and colleges for pursuing courses online.
With various academic achievements and accolades, Jaro Education is known for providing students with the most innovative and successful online management and technology programs in India.
Jaro was awarded Best Employer for "Career Development and Leadership Grooming" by the Employer Branding Awards (EBA) in February 2016. We have received prestigious national-level awards for our exceptional contribution to the education industry.
Job Description:
1. Connecting with a minimum of 80 working professionals (prospective students) each day from the leads/database available on the LeadSquared CRM
2. Generating a pool of prospects by identifying upskilling needs based on the student's area of interest
3. Helping prospective students with detailed information about the programs offered, through phone or video counselling, and creating a strong pipeline
4. Ensuring daily deliverables are met and weekly/monthly enrolment targets are achieved
Remuneration – Ranging from 4 LPA to 6.25 LPA fixed + Monthly Variables
Working days - Monday to Saturday
Office Timings - 10:00 a.m. to 7:00 p.m.

- Must be a student of a prestigious institution.
- Socially active
- Present on campus
Skills
- Good Communication Skills
- Convincing Skills
Roles and Responsibilities
- Bringing quality tutors to our platform.
- Sharing posters provided by TutorBin on your campus and on LinkedIn.
- Circulating posters and videos through email, WhatsApp, Instagram, Facebook, etc.
Rewards and Incentives
- Certificate of working as a campus ambassador.
- Rs 1500 in the first month.
- Reference benefits
- Opportunity to become part of our future programs/ projects.
- TutorBin T-shirts will be given

- 3–5 years of practical data science experience working with varied data sets. Experience with retail banking is preferred but not necessary.
- Strong grasp of statistical modelling concepts; we are particularly looking for practical knowledge gained from work experience (the candidate should be able to give "rule of thumb" answers).
- Strong problem-solving skills and the ability to articulate ideas well.
- Ideally, the data scientist should have interfaced with data engineering and model deployment teams to bring models/solutions live in production.
- A strong working knowledge of the Python ML stack is very important here.
- Willingness to work on a diverse range of tasks in building ML-related capability on the Corridor Platform as well as on client work.
- A strong interest in the data engineering aspects of ML is highly preferred, i.e. the ability to play a dual role of data scientist and developer who can write robust code for modules on the Corridor Platform.
Structured ML techniques for candidates (a brief illustrative sketch follows this list):
- GBM
- XGBoost
- Random Forest
- Neural Net
- Logistic Regression
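A brief, purely illustrative sketch comparing three of the listed techniques (logistic regression, random forest, and a gradient-boosted model) on a synthetic dataset with scikit-learn; swapping in XGBoost or a small neural network follows the same pattern:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic binary-classification data standing in for e.g. a credit-risk table
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=42)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "gbm": GradientBoostingClassifier(random_state=42),
}

# Cross-validated AUC is a typical quick comparison on structured data
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean AUC = {scores.mean():.3f}")
```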


Good experience working with CodeIgniter/Laravel, jQuery, Ajax, AngularJS, and Bootstrap with good design skills; MySQL, MongoDB (optional), WordPress (optional).
Responsibilities:
- Write clean, well-designed code
- Troubleshoot and test core product software to ensure strong optimization
- Contribute to all phases of the development lifecycle
Qualifications:
- Bachelor's degree in Computer Science or a related field
- Experience in software development
- Passion for best design and coding practices
- Strong knowledge of relational databases and tools, and strong PHP skills.

