
Title: Data Engineer (Azure) (Location: Gurgaon/Hyderabad)
Salary: Competitive as per Industry Standard
We are expanding our Data Engineering team and hiring passionate professionals with extensive knowledge and experience in building and managing large enterprise data and analytics platforms. We are looking for creative individuals with strong programming skills who can understand complex business and architectural problems and develop solutions. The individual will work closely with the rest of our data engineering and data science teams to implement and manage Scalable Smart Data Lakes, Data Ingestion Platforms, Machine Learning and NLP-based Analytics Platforms, Hyper-Scale Processing Clusters, Data Mining, and Search Engines.
What You’ll Need:
- 3+ years of industry experience creating and managing end-to-end data solutions, optimal data processing pipelines, and architectures that handle large-volume big data sets of varied data types.
- Proficiency in Python, Linux and shell scripting.
- Strong knowledge of PySpark and Pandas DataFrames for writing efficient pre-processing and other data manipulation tasks (see the sketch after this list).
- Strong experience in developing the infrastructure required for data ingestion and for optimal extraction, transformation, and loading of data from a wide variety of data sources, using tools like Azure Data Factory and Azure Databricks (or Jupyter notebooks / Google Colab, or other similar tools).
- Working knowledge of GitHub or other version control tools.
- Experience with creating RESTful web services and API platforms.
- Ability to work with data science and infrastructure team members to implement practical machine learning solutions and pipelines in production.
- Experience with cloud providers like Azure/AWS/GCP.
- Experience with SQL and NoSQL databases such as MySQL, Azure Cosmos DB, HBase, MongoDB, and Elasticsearch.
- Experience with stream-processing systems such as Spark Streaming and Kafka, and working experience with event-driven architectures.
- Strong analytic skills related to working with unstructured datasets.
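To give a flavour of the pre-processing work mentioned in the list above, here is a minimal PySpark/Pandas sketch; the file path, column names, and transformations are hypothetical and purely illustrative, not part of the role's actual codebase.

```python
# Minimal, illustrative PySpark pre-processing sketch.
# The file path and column names (events.csv, user_id, amount, country) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("preprocessing-sketch").getOrCreate()

# Read a raw CSV extract, inferring the schema for brevity.
raw_df = spark.read.csv("/data/raw/events.csv", header=True, inferSchema=True)

# Typical clean-up: drop duplicates, fill missing values, normalise a text column.
clean_df = (
    raw_df.dropDuplicates(["user_id"])
          .fillna({"amount": 0.0})
          .withColumn("country", F.upper(F.trim(F.col("country"))))
)

# A simple aggregation as an example of a downstream data manipulation task.
summary_df = clean_df.groupBy("country").agg(F.sum("amount").alias("total_amount"))

# Convert a small result set to Pandas for further analysis or reporting.
summary_pd = summary_df.toPandas()
print(summary_pd.head())
```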
Good to have (to filter or prioritize candidates):
- Experience with testing libraries such as pytest for writing unit tests for the developed code.
- Knowledge of machine learning algorithms and libraries would be good to have; implementation experience would be an added advantage.
- Knowledge of and experience with data lakes, Docker, and Kubernetes would be good to have.
- Knowledge of Azure Functions, Elasticsearch, etc. will be good to have.
- Experience with model versioning (e.g., MLflow) and data versioning will be beneficial.
- Experience with microservice frameworks or with Python libraries such as Flask for hosting ML services and models would be great (a minimal sketch follows this list).
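As a rough illustration of the Flask-based model hosting mentioned in the last item above, here is a minimal sketch; the model file name, endpoint path, and input format are assumptions for illustration only, and the model is assumed to expose a scikit-learn-style predict method.

```python
# Minimal, illustrative Flask sketch for serving a pickled ML model.
# The model file (model.pkl), endpoint path, and JSON format are hypothetical.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Load a pre-trained model once at start-up (assumes model.pkl exists on disk).
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    # Expect a JSON body like {"features": [1.0, 2.0, 3.0]}.
    features = [payload["features"]]
    prediction = model.predict(features)
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```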

About Scry AI
Scry AI invents, designs, and develops cutting-edge technology-based Enterprise solutions powered by Machine Learning, Natural Language Processing, Big Data, and Computer Vision.
Scry AI is an R&D organization leading innovation in business automation technology and has been helping companies and businesses transform how they work.
Catering to core industries like Fintech, Healthcare, Communication, Mobility, and Smart Cities, Scry has invested heavily in R&D to build cutting-edge product suites that address challenges and roadblocks that plague traditional business environments.
Similar jobs
Position: AWS Data Engineer
Experience: 5 to 7 Years
Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram
Work Mode: Hybrid (3 days work from office per week)
Employment Type: Full-time
About the Role:
We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.
Key Responsibilities:
- Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL (see the sketch after this list).
- Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
- Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
- Optimize data models and storage for cost-efficiency and performance.
- Write advanced SQL queries to support complex data analysis and reporting requirements.
- Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
- Ensure high data quality and integrity across platforms and processes.
- Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.
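To give a flavour of the Glue/PySpark pipelines referred to in the first responsibility above, a minimal Glue job sketch follows; the catalog database, table name, selected columns, and S3 output path are hypothetical placeholders, not the actual pipeline.

```python
# Minimal, illustrative AWS Glue job sketch (PySpark).
# The catalog database, table, columns, and S3 output path below are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (hypothetical database/table names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Simple transformation: keep selected columns and drop rows with null order ids.
orders = (
    source.toDF()
          .select("order_id", "customer_id", "amount")
          .na.drop(subset=["order_id"])
)

# Write the result to S3 as Parquet (hypothetical bucket/prefix).
orders.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

job.commit()
```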
Required Skills & Experience:
- Strong hands-on experience with Python or PySpark for data processing.
- Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Familiarity with serverless architectures and AWS best practices.
- Experience in designing and maintaining robust data architectures and data lakes.
- Ability to troubleshoot and resolve data pipeline issues efficiently.
- Strong communication and stakeholder management skills.
Senior Software Engineer
Gocomet
**Desired Candidate**
- The ideal candidate is a self-motivated multi-tasker and a demonstrated team player. You will be a senior developer responsible for the development of new software products and enhancements to existing products.
- You should excel in working with large-scale applications and frameworks and have outstanding communication and leadership skills.
**Responsibilities**
- Writing clean, high-quality, high-performance, maintainable code
- Develop and support software including applications, database integration, interfaces, and new functionality enhancements
- Own and complete full projects beginning with identifying and communicating the problems to be solved, getting and incorporating feedback on proposed architectural solutions, and making a final decision as the owner of a project.
- Show curiosity to not only learn new things but fully understand how they work
- Be highly productive - have a reputation for getting things done quickly and efficiently
- Be a mentor for other engineers
- Deconstruct a problem into an executable action plan for themselves and other engineers, and carry it out to a high standard without issue
- Set and maintain high individual and team expectations
- Actively participate in frequent code/design/architecture reviews
- Be able to communicate well with all engineers regardless of seniority
- Generate support for a company/team decision
**Requirements**
- At least 2 years of experience in development, with extensive experience using Ruby/Golang/Python/Node.js.
- Excellent understanding of Object Oriented Programming
- Ability to self-manage and work autonomously in a collaborative environment
- A focus on detail including around automated tests and documenting your code
- An agile mindset and the ability to adapt to changing priorities and requirements
- Good at analyzing and solving problems
- Passionate about working in a start-up
**What you will get**
- Product ownership - take autonomy over core products & product features
- Be a part of early tech team
- Stock options
**Our stack**
Microservice Architecture, Kubernetes, PostgreSQL, MongoDB, Redis, Ruby on Rails, ReactJS, Node.js, Jenkins, RabbitMQ, Flutter, Apache Kafka
Our continuous releases are integrated with Jenkins, Bitbucket & Kubernetes. On the frontend, we use React for the views, organize the data flow with Flux architecture, and test our application with RSpec.
On the backend, we're a Rails shop (ROR) riding on AWS/GCP and Postgres RDS.
Working Days: 5 (Saturdays and Sundays off).
Why GoComet?
About GoComet (www.gocomet.com)
GoComet - our Logistics Resource Management (LRM) SaaS platform leverages the combined power of data science and machine intelligence. It facilitates sharp reverse auctions bringing out the best possible end to end rates for shipments, saves time, optimises operations, and increases deal transparency and efficiencies for enterprises’ freight procurement processes.
Owing to our growing impact and potential, the Singapore Government (SGInnovate) is now backing us as an investor. Also, our global customers (including Fortune 500 conglomerates) like Schaeffler, Glenmark, Sun Pharma, Polyplex, and Indorama Ventures trust and recommend us.
Besides, we were also recently mentioned in the Gartner Visibility Guide.
Role - Technology Lead
Role Overview
As a Tech lead, you will take an active role in the definition and evolution of standard practices and procedures, working closely with the co-founders in building a team. You will be responsible for defining and developing software for tasks associated with the developing, designing and debugging of the software applications. The technology lead will be a part of the core leadership team and will be involved in all crucial decisions. So apart from strong technical skills, leadership is something we are looking for.
Responsibilities
▪ 3 to 5 years of work experience on large-scale applications, with a desire to work in fast-paced startups
▪ Must have led a team of 3 to 10 engineers. Excellent business understanding and prioritization skills. Fully hands-on with coding and up to date with technology trends.
▪ Focus on code maintainability and application performance, and a demonstrated product development ability.
▪ Must be excellent at hiring and should have built a team from scratch or from a small size
▪ Provide technical advice and assist in solving programming problems
▪ Continuously create new and interactive features that help improve user experience and user engagement.
▪ Assist with troubleshooting of issues as needed
Tech Stack:
Node.js, NativeScript, Angular, React, React Native, MongoDB, AWS.
Responsibilities
- Lead the app team to develop apps using Flutter for both Android & iOS platforms.
- Implement best practices for app development, usage tracking, and issue fixes.
- Do regular code reviews to ensure code quality.
- Be an individual contributor to write quality code that is simple, reliable, and scalable.
- Implement test-driven development.
- Ensure the best performance and user experience of the application.
- Work with the team to manage, optimize, and customize multiple applications.
- Evaluate and implement out-of-the-box ideas for application development.
Required Skills
- Must have 8+ years of experience in native mobile development (Android / iOS).
- Must have 2+ years of experience in Flutter with Dart.
- Must have worked on and successfully deployed apps to the Play Store/App Store using Flutter.
- Experience with any of the state management solutions like Bloc, Provider, Mobx, etc.
- Extreme attention to detail and the ability to match the design as closely as possible.
Job Perks
- Get to work with a highly passionate team of engineers.
- Open and embracing culture towards the latest hot technologies.
- A high level of freedom & responsibility.
- The Company Activation Specialist team is accountable for tactical buying and implementation for a set of clients.
- The WM Activation Specialist team understands clients' brand identity, the agency's deliverables, and campaign KPIs.
- Understand the campaign brief received from the Agency Specialist lead, and provide benchmarking/planning rates/final negotiation in line with the SOP.
- Coordinate with other stakeholders, mainly the WM Planning team, for deal closures.
- Negotiate with vendors for your activation plan, including campaign value adds, FCT management, campaign execution, etc.
- Optimize the activation plan to achieve the objectives/ KPI.
- Close the optimized plan with Activation Specialist lead.
- Track the live campaigns on a daily/weekly basis in terms of spot implementation, campaign performance, mid-evaluations, etc.
- Campaign management, including performance and corrective measures. In-flight campaign optimization is compulsorily done for every single campaign.
- Follow the end-to-end process for campaign execution and campaign management.
Key Responsibilities:
- Understand Business requirements in BI context and design data models to transform raw data into meaningful insights
- Create dashboards and interactive visual reports using Power BI
- Analyze data and present data through reports that aid decision-making
- Design, develop, test, and deploy Power BI scripts and perform detailed analytics
- Communicate with Business units & leadership team and aim at better visualization and transparency of data analytics, driving insightful business strategies and improved business performances
- Partner with other Technology teams to create Data models, Charts & Reports in a scalable and controlled environment
Knowledge/Experience:
- A bachelor’s degree in data science, data analytics, computer science, mathematics, statistics, econometrics, financial engineering, computer or electrical engineering, or other related quantitative fields
- 4-5 years of experience in data preparation and analysis and business intelligence using data visualization tools (e.g., Power BI), with sound knowledge of JavaScript and SQL
- An understanding of Machine Learning and Artificial Intelligence (AI)
- Experience with analysis in FMCG or Electronic sector is preferred
- Strong knowledge of Microsoft Excel & PowerPoint
Skills/Qualifications:
- Strong analytical skills and ability to work in a fast-paced environment
- Business acumen to understand complex business problems and translate them into analysis that leads to actionable business insights
- Strong communication skills (oral and written) to explain complex data and analytical problems
- Willingness to learn and to explore new ideas, with independent thinking and attention to details
- A strong work ethic, a can-do attitude, and excellent interpersonal skills
Tableau Developer Responsibilities:
- Developing, maintaining, and managing advanced reporting, analytics, dashboards and other BI solutions.
- Performing and documenting data analysis, data validation, and data mapping/design.
- Reviewing and improving existing systems and collaborating with teams to integrate new systems.
- Conducting unit tests and developing database queries to analyze the effects and troubleshoot any issues.
- Creating tools to store data within the organization.









