
AWS Glue Developer
Work Experience: 6 to 8 Years
Work Location: Noida, Bangalore, Chennai & Hyderabad
Must Have Skills: AWS Glue, DMS, SQL, Python, PySpark, data integration, and DataOps
Job Reference ID: BT/F21/IND
Job Description:
Design, build and configure applications to meet business process and application requirements.
Responsibilities:
➢ 7 years of work experience with ETL, data modelling, and data architecture.
➢ Proficient in ETL optimization, designing, coding, and tuning big data processes using PySpark.
➢ Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions.
➢ Orchestration using Airflow.
Technical Experience:
➢ Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines.
➢ Experience building data pipelines and applications to stream and process large datasets at low latency.
➢ Enhancements, new development, defect resolution, and production support of big data ETL development using AWS native services.
➢ Create data pipeline architecture by designing and implementing data ingestion solutions.
➢ Integrate data sets using AWS services such as Glue, Lambda functions, and Airflow.
➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Athena.
➢ Author ETL processes using Python and PySpark.
➢ Build Redshift Spectrum direct transformations and data modelling using data in S3.
➢ Monitor ETL processes using CloudWatch Events.
➢ Work in collaboration with other teams; good communication is a must.
➢ Must have experience using AWS service APIs, the AWS CLI, and SDKs.
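The ETL authoring responsibilities above follow an extract-transform-load shape. A minimal sketch in plain Python (a Glue PySpark job would apply the same pattern against S3 and Redshift; the schema and data here are illustrative assumptions, not from the posting):

```python
import csv
import io
from collections import defaultdict

# Illustrative raw data standing in for files landed in S3 (hypothetical schema).
RAW_CSV = """order_id,region,amount
1,us-east,120.50
2,eu-west,80.00
3,us-east,45.25
4,ap-south,99.99
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV into records (a Glue job would read from S3)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records: list[dict]) -> dict[str, float]:
    """Transform: cast types and aggregate revenue per region."""
    totals: dict[str, float] = defaultdict(float)
    for row in records:
        totals[row["region"]] += float(row["amount"])
    return dict(totals)

def load(totals: dict[str, float]) -> list[tuple[str, float]]:
    """Load: emit sorted rows (a real job would write to Redshift or back to S3)."""
    return sorted(totals.items())

rows = load(transform(extract(RAW_CSV)))
print(rows)
```

In a real Glue job, `extract` would read via a DynamicFrame and `load` would write to a data store; the transformation logic is the part that carries over.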
Professional Attributes:
➢ Experience operating very large data warehouses or data lakes.
➢ Expert-level skills in writing and optimizing SQL.
➢ Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technology.
➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, DynamoDB, Athena, and Glue in an AWS environment.
➢ Expertise in S3, RDS, Redshift, Kinesis, and EC2 clusters is highly desired.
Qualification:
➢ Degree in Computer Science, Computer Engineering or equivalent.
Salary: Commensurate with experience and demonstrated competence

Similar jobs
Strong Lead – User Research & Analytics profile (behavioural/user/product/UX analytics)
Mandatory (Experience 1): Must have 10+ years of experience in Behavioral Data Analytics, User Research, or Product Insights, driving data-informed decision-making for B2C digital products (web and app).
Mandatory (Experience 2): Must have 6+ months of experience in analyzing user journeys, clickstream, and behavioral data using tools such as Google Analytics, Mixpanel, CleverTap, Firebase, or Amplitude.
Mandatory (Experience 3): Experience in leading cross-functional user research and analytics initiatives in collaboration with Product, Design, Engineering, and Business teams to translate behavioral insights into actionable strategies.
Mandatory (Skills 1): Strong expertise in A/B testing and experimentation, including hypothesis design, execution, statistical validation, and impact interpretation.
Mandatory (Skills 2): Ability to identify behavioral patterns, funnel drop-offs, engagement trends, and user journey anomalies using large datasets and mixed-method analysis.
Mandatory (Skills 3): Hands-on proficiency in SQL, Excel, and data visualization/storytelling tools such as Tableau, Power BI, or Looker for executive reporting and dashboard creation.
Mandatory (Skills 4): Deep understanding of UX principles, customer journey mapping, and product experience design, with experience integrating qualitative and quantitative insights.
Mandatory (Company): B2C product organizations (fintech, e-commerce, edtech, or consumer platforms) with large-scale user datasets and analytics maturity.
Mandatory (Note): We are not looking for pure data analysts, but for strategic behavioral-insight leaders or research-driven analytics professionals focused on user behavior and product decision-making.
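The statistical validation of A/B experiments called for in Skills 1 often reduces to a two-proportion z-test on conversion rates. A stdlib-only sketch (the traffic and conversion numbers are purely illustrative):

```python
import math

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test comparing conversion rates of A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-approximation two-sided p-value via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative experiment: control converts at 10%, variant at 13%.
z, p = two_proportion_ztest(conv_a=100, n_a=1000, conv_b=130, n_b=1000)
print(f"z={z:.3f}, p={p:.4f}, significant at 5%: {p < 0.05}")
```

With these illustrative numbers the variant's 3-point lift gives z ≈ 2.10 and p ≈ 0.036, so it clears a 5% significance bar; in practice the sample sizes would come from an up-front power calculation.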
Data Scientist
Job Id: QX003
About Us:
QX Impact was launched with a mission to make AI accessible and affordable, and to deliver AI products and solutions at scale for enterprises by bringing together the power of data, AI, and engineering to drive digital transformation. We believe that without insights, businesses will continue to struggle to understand their customers, and may even lose them; without insights, businesses won't be able to deliver differentiated products and services; and without insights, businesses can't achieve the new level of "Operational Excellence" that is crucial to remaining competitive, meeting rising customer expectations, expanding into new markets, and digitalizing.
Position Overview:
We are seeking a collaborative and analytical Data Scientist who can bridge the gap between business needs and data science capabilities. In this role, you will lead and support projects that apply machine learning, AI, and statistical modeling to generate actionable insights and drive business value.
Key Responsibilities:
- Collaborate with stakeholders to define and translate business challenges into data science solutions.
- Conduct in-depth data analysis on structured and unstructured datasets.
- Build, validate, and deploy machine learning models to solve real-world problems.
- Develop clear visualizations and presentations to communicate insights.
- Drive end-to-end project delivery, from exploration to production.
- Contribute to team knowledge sharing and mentorship activities.
Must-Have Skills:
- 3+ years of progressive experience in data science, applied analytics, or a related quantitative role, demonstrating a proven track record of delivering impactful data-driven solutions.
- Exceptional programming proficiency in Python, including extensive experience with core libraries such as Pandas, NumPy, Scikit-learn, NLTK, and XGBoost.
- Expert-level SQL skills for complex data extraction, transformation, and analysis from various relational databases.
- Deep understanding and practical application of statistical modeling and machine learning techniques, including but not limited to regression, classification, clustering, time series analysis, and dimensionality reduction.
- Proven expertise in end-to-end machine learning model development lifecycle, including robust feature engineering, rigorous model validation and evaluation (e.g., A/B testing), and model deployment strategies.
- Demonstrated ability to translate complex business problems into actionable analytical frameworks and data science solutions, driving measurable business outcomes.
- Proficiency in advanced data analysis techniques, including Exploratory Data Analysis (EDA), customer segmentation (e.g., RFM analysis), and cohort analysis, to uncover actionable insights.
- Experience in designing and implementing data models, including logical and physical data modeling, and developing source-to-target mappings for robust data pipelines.
- Exceptional communication skills, with the ability to clearly articulate complex technical findings, methodologies, and recommendations to diverse business stakeholders (both technical and non-technical audiences).
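The RFM analysis named in the must-have skills can be sketched with the standard library; the transaction log and dates below are invented purely for illustration:

```python
from datetime import date

# Hypothetical transaction log: (customer_id, purchase_date, amount).
TRANSACTIONS = [
    ("c1", date(2024, 6, 1), 50.0),
    ("c1", date(2024, 6, 20), 30.0),
    ("c2", date(2024, 3, 15), 200.0),
    ("c3", date(2024, 6, 25), 10.0),
    ("c3", date(2024, 6, 28), 15.0),
    ("c3", date(2024, 6, 29), 20.0),
]
TODAY = date(2024, 7, 1)

def rfm(transactions, today):
    """Compute Recency (days since last purchase), Frequency, Monetary per customer."""
    acc = {}
    for cid, when, amount in transactions:
        last, freq, money = acc.get(cid, (when, 0, 0.0))
        acc[cid] = (max(last, when), freq + 1, money + amount)
    return {cid: ((today - last).days, f, m) for cid, (last, f, m) in acc.items()}

scores = rfm(TRANSACTIONS, TODAY)
print(scores)
```

In a real segmentation, each of the three values would then be binned into quantile scores (e.g. 1-5) and the combined score used to label segments such as "champions" or "at risk".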
Good-to-Have Skills:
- Experience with cloud platforms (Azure, AWS, GCP) and specific services such as Azure ML, Synapse, Azure Kubernetes Service, and Databricks.
- Familiarity with big data processing tools like Apache Spark or Hadoop.
- Exposure to MLOps tools and practices (e.g., MLflow, Docker, Kubeflow) for model lifecycle management.
- Knowledge of deep learning libraries (TensorFlow, PyTorch) or experience with Generative AI (GenAI) and Large Language Models (LLMs).
- Proficiency with business intelligence and data visualization tools such as Tableau, Power BI, or Plotly.
- Experience working within Agile project delivery methodologies.
Competencies:
· Tech Savvy - Anticipating and adopting innovations in business-building digital and technology applications.
· Self-Development - Actively seeking new ways to grow and be challenged using both formal and informal development channels.
· Action Oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm.
· Customer Focus - Building strong customer relationships and delivering customer-centric solutions.
· Optimizes Work Processes - Knowing the most effective and efficient processes to get things done, with a focus on continuous improvement.
Why Join Us?
- Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions.
- Work on impactful projects that make a difference across industries.
- Opportunities for professional growth and continuous learning.
- Competitive salary and benefits package.
Role & Responsibilities:
- Develop and maintain backend functionality for websites and web applications.
- Create custom WordPress themes and plugins based on requirements.
- Work with REST API integration and database management (MySQL).
- Follow MVC principles for structured code development.
- Collaborate with frontend developers to integrate designs and functionality.
- Use Git for version control.
Phyllo is a developer tool for developers and businesses to access data of creators/digital solopreneurs directly from the source platforms (e.g. YouTube, Twitch, Upwork, Shopify) with their consent. Think of us as an "account aggregator" for the creator economy, through which creators/digital solopreneurs can link their accounts and share their data with any developer/business who needs it. Such data, once shared, can act as proof of a creator's digital identity, their work, or their income.
We are backed by reputed investors and have raised a $15M Series A from RTP Global, Nexus Venture Partners and top angels from Facebook, Teachable, Coinbase and more.
More info at:
- Our Crunchbase Profile: https://www.crunchbase.com/organization/phyllo
- Our coverage by cryptechie, a popular US blog on technology and web3: https://www.cryptechie.com/p/creatoreconomy
You will be responsible for:
- Running Outreach Campaigns to acquire new leads and qualify them.
- Discover and hunt new target prospects.
- Facilitating product demonstrations and technical audits.
- You will own the responsibility for building a healthy sales pipeline and meeting quarterly and annual lead targets.
- You must be able to collaborate within a team environment while holding yourself accountable to the highest of standards.
Here is how you will continuously add value to the organisation:
- Research and identify new business prospects; identify trends in the creator economy to discover new developers building for creators.
- Identify ways of connecting with these prospects to discuss business opportunities.
- Understand clients' needs, consult with them, and present relevant solutions; demonstrate Phyllo's capabilities relevant to them.
- Manage the CRM and log deal data accordingly.
- Attend and participate in weekly sales team meetings.
- Attend and participate in product development and strategy meetings, presenting input from the field and acting as the “voice of the customer.”
Job Requirements:
We are looking for someone with:
- 1-3 years of B2B SaaS or Developer Tool Sales experience
- Sales experience in an early-stage startup
- Strong communication and interpersonal skills.
- Willingness to build a career in sales
Skill set: ASP.NET, Web, MySQL, C#, HTML; Angular (good to have)
Job Description as mentioned below:
· MS SQL: database design, stored procedures, functions, views, triggers, and advanced queries
· C# version 4 or higher
· Visual Studio 2013 or later
· ASP.NET Web Forms knowledge
· Advanced jQuery/JavaScript
· Ability to understand the existing applications and work on enhancements
· Ability to understand requirements quickly
· Documentation skills and requirement analysis
· Good communication skills
Magassians is looking for a passionate backend engineer. It will be expected from you to build pragmatic solutions on mission-critical initiatives. If you know your stuff, see the beauty in code, have knowledge in depth and breadth, advocate best practices, and love to work with distributed systems, then this is an ideal position for you.
You will be responsible for:
- Designing the architecture of the core platforms of Magna5's products.
- Spending the majority of your time writing and reviewing code.
- Working with the team to define timelines for deliverables.
- Sharing knowledge with peers through brown bags and meetups.
We are looking for:
- 3-5 years of overall software development experience
- 1 year of Go programming experience
- Expertise working with SQL and NoSQL databases
- Experience using HTTP and AMQP transport mediums and streaming technologies like Kafka and NATS Streaming
- Inner-drive to take the team to the next level in a fast-paced environment
Interview Process:
Our interview process is short and straightforward, with three rounds.
- Code - The first thing we look at is your ability to write code and tests. If you have something you can share, great; we will take a look. Otherwise, we will send you an assignment.
- Technical Round - We schedule a technical round that can go up to 2 hours, where we have technical discussions plus remote pairing sessions.
- Culture Fit & Compensation - We double-check that the potential team member is a culture fit; culture fit is as important to us as writing code. Once we know that you are a great fit, we discuss and finalize numbers on the same call. You will have three days to accept the offer.
Job Type : Permanent
Experience : 5+ years
Job Location : Jasola, New Delhi | Onsite
RESPONSIBILITIES :
- Lead the Sheeva product teams in implementing and onboarding new applications and use cases in the Sheeva cloud platform.
- Design and develop data architectures and use cases based on market needs.
- Lead the development of web-based dashboards, mobile applications, and API integrations for new use cases, linked to the Sheeva telematics algorithms.
- Manage third-party contractors involved in design and development of dashboards, mobile apps, and other technical products.
- Oversee the development of API documentation and manuals.
- Interact with Sheeva sales and business development teams to identify the client needs, their feasibility and implementation guidelines.
- Participate in client meetings when necessary.
- Support customer API integration processes and technical onboarding.
REQUIREMENTS :
- 5+ years of experience building and creating SaaS ecosystems with complex architecture and multiple elements.
- The technical skills and experience to become a specialist on Sheeva's products, to map those products to customers' needs, and to craft and construct solutions to fill those needs.
- Experience building complex API architectures, documentation, and integrations with multiple enterprise clients across various industries.
- Database and service architecture experience, especially SQL.
- Managerial experience leading a motivated international team of developers and data scientists, both front-end and back-end.
- Ability to code rapidly and efficiently in various tech stacks.
- The consulting skills to thrive as the face of Sheeva Services in front of Customers' executives and engineers alike, leading discovery, whiteboarding, and pair coding sessions.
- A background in contactless payments and/or app technology is a plus.
- Proven attention to detail through prior work or life experience.
- Comfortable with a variety of responsibilities.
- Comfortable with a minimal amount of direction but high expectations.
- Fits the Sheeva culture.
- Position may require travel of up to 10% per quarter.
- Graduation from a top-tier university is a plus.
- Create well-designed, tested code using best practices for website development, including mobile and responsive site design
- Integrate data from various back-end services and databases
- Create and maintain software documentation
- Stay plugged into emerging technologies and industry trends, and apply them in operations and activities.
- Cooperate with web designers to match visual design intent
- B Tech/BE or M.Tech/ME in Computer Science or equivalent from a reputed college.
- Experience level of 7+ years in building large scale applications.
- Strong problem solving skills, data structures and algorithms.
- Experience with distributed systems handling large amount of data.
- Excellent coding skills in Java / Python / Node / Go.
- Very good understanding of Web Technologies.
- Very good understanding of any RDBMS and/or messaging.
