Objective
The Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure that an optimal data delivery architecture is consistent across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Roles and Responsibilities:
- Comfortable building and optimizing performant data pipelines, including data ingestion, cleansing, and curation into a data warehouse, database, or other data platform, using Dask/Spark.
- Experience with distributed computing environments and Spark/Dask architecture.
- Optimize performance for data access requirements by choosing appropriate file formats (Avro, Parquet, ORC, etc.) and compression codecs.
- Experience writing production-ready, tested code in Python; participate in code reviews to maintain and improve code quality, stability, and supportability.
- Experience in designing data warehouse/data mart.
- Experience with any RDBMS (preferably SQL Server); must be able to write complex SQL queries.
- Expertise in requirement gathering and in writing technical design and functional documents.
- Experience in Agile/Scrum practices.
- Experience in leading other developers and guiding them technically.
- Experience in deploying data pipelines using automated CI/CD approach.
- Ability to write modularized reusable code components.
- Proficient in identifying data issues and anomalies during analysis.
- Strong analytical and logical skills.
- Must be able to comfortably tackle new challenges and learn.
- Must have strong verbal and written communication skills.
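The pipeline-building work in the first bullet can be sketched at toy scale in plain Python; a real implementation would use Dask or Spark dataframes, but the cleansing steps (trim, drop incomplete records, de-duplicate) have the same shape. All names here are illustrative only, not part of any actual pipeline:

```python
import csv
import io

def clean_rows(raw_csv):
    """Ingest CSV text, trim whitespace, drop rows missing an id, and de-duplicate by id."""
    seen = set()
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        row = {k: v.strip() for k, v in row.items()}
        if not row["id"] or row["id"] in seen:
            continue  # skip incomplete or duplicate records
        seen.add(row["id"])
        cleaned.append(row)
    return cleaned

# A tiny batch with one incomplete row and one duplicate.
raw = "id,name\n1, Ada \n,missing\n1,Ada\n2,Grace\n"
rows = clean_rows(raw)
```

In Dask or Spark the same logic would become dataframe operations (`dropna`, `drop_duplicates`) so it scales past a single machine.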
Required skills:
- Knowledge of GCP
- Expertise in Google BigQuery
- Expertise in Airflow
- Good hands-on SQL skills
- Data warehousing concepts
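The data warehousing concepts listed above usually come down to fact and dimension tables queried with joins and rollups. A minimal sketch using in-memory SQLite as a stand-in for BigQuery (the SQL shape is the same; the table and column names are hypothetical):

```python
import sqlite3

# Minimal star schema: one fact table joined to one dimension, then aggregated --
# the same shape a BigQuery rollup over a warehouse would take.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (10, 1, 5.0), (11, 1, 7.5), (12, 2, 20.0);
""")

# Roll revenue up to the category grain via the dimension table.
totals = dict(con.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product d USING (product_id)
    GROUP BY d.category
""").fetchall())
```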
About Affine Analytics
A Delhi NCR based Applied AI & Consumer Tech company tackling one of the largest unsolved consumer internet problems of our time. We are a motley crew of smart, passionate and nice people who believe you can build a high performing company with a culture of respect aka a sports team with a heart aka a caring meritocracy.
Our illustrious angels include unicorn founders, serial entrepreneurs with exits, tech & consumer industry stalwarts and investment professionals/bankers.
We are hiring for our founding team (in Delhi NCR only, no remote) that will take the product from prototype to a landing! Opportunity for disproportionate non-linear impact, learning and wealth creation in a classic 0-1 with a Silicon Valley caliber founding team.
Key Responsibilities:
1. Data Strategy and Vision:
· Develop and drive the company's data analytics strategy, aligning it with overall business goals.
· Define the vision for data analytics, outlining clear objectives and key results (OKRs) to measure success.
2. Data Analysis and Interpretation:
· Oversee the analysis of complex datasets to extract valuable insights, trends, and patterns.
· Utilize statistical methods and data visualization techniques to present findings in a clear and compelling manner to both technical and non-technical stakeholders.
3. Data Infrastructure and Tools:
· Evaluate, select, and implement advanced analytics tools and platforms to enhance data processing and analysis capabilities.
· Collaborate with IT teams to ensure a robust and scalable data infrastructure, including data storage, retrieval, and security protocols.
4. Collaboration and Stakeholder Management:
· Collaborate cross-functionally with teams such as marketing, sales, and product development to identify opportunities for data-driven optimizations.
· Act as a liaison between technical and non-technical teams, ensuring effective communication of data insights and recommendations.
5. Performance Measurement:
· Establish key performance indicators (KPIs) and metrics to measure the impact of data analytics initiatives on business outcomes.
· Continuously assess and improve the accuracy and relevance of analytical models and methodologies.
Qualifications:
- Bachelor's or Master's degree in Data Science, Statistics, Computer Science, or related field.
- Proven experience (5+ years) in data analytics, with a focus on leading analytics teams and driving strategic initiatives.
- Proficiency in data analysis tools such as Python, R, SQL, and advanced knowledge of data visualization tools.
- Strong understanding of statistical methods, machine learning algorithms, and predictive modelling techniques.
- Excellent communication skills, both written and verbal, to effectively convey complex findings to diverse audiences.
DATA ANALYST
About:
We allow customers to "buy now and pay later" for goods and services purchased through online and offline portals. It's a rapidly growing organization opening up new avenues of payments for online and offline customers.
Role:
Define and continuously refine the analytics roadmap. Build, deploy, and maintain the data infrastructure that supports all analysis, including the data warehouse and various data marts. Build, deploy, and maintain the predictive models and scoring infrastructure that power critical decision management systems. Strive to devise ways to gather more alternate data and build increasingly enhanced predictive models. Partner with business teams to systematically design experiments to continuously improve customer acquisition, minimize churn, reduce delinquency, and improve profitability. Provide data insights to all business teams through automated queries, MIS, etc.
Requirements:
4+ years of deep, hands-on analytics experience in a management consulting, start-up, financial services, or fintech company. Strong knowledge of SQL and Python. Deep knowledge of problem-solving approaches using analytical frameworks. Deep knowledge of frameworks for data management, deployment, and monitoring of performance metrics. Hands-on exposure to delivering improvements through test-and-learn methodologies. Excellent communication and interpersonal skills, with the ability to be pleasantly persistent.
Location: Mumbai
We are looking for an exceptionally talented Lead Data Engineer with experience implementing AWS services to build data pipelines, integrate APIs, and design data warehouses. A candidate with both hands-on and leadership capabilities will be ideal for this position.
Qualification: At least a bachelor's degree in Science, Engineering, or Applied Mathematics; a master's degree is preferred.
Job Responsibilities:
• Total 6+ years of experience as a Data Engineer and 2+ years of experience in managing a team
• Have minimum 3 years of AWS Cloud experience.
• Well versed in languages such as Python, PySpark, SQL, NodeJS, etc.
• Extensive experience in the Spark ecosystem, having worked on both real-time and batch processing
• Have experience in AWS Glue, EMR, DMS, Lambda, S3, DynamoDB, Step functions, Airflow, RDS, Aurora etc.
• Experience with modern Database systems such as Redshift, Presto, Hive etc.
• Experience building data lakes on S3 or with Apache Hudi
• Solid understanding of Data Warehousing Concepts
• Good to have experience on tools such as Kafka or Kinesis
• Good to have AWS Developer Associate or Solutions Architect Associate Certification
• Have experience in managing a team
We are #hiring for AWS Data Engineer expert to join our team
Job Title: AWS Data Engineer
Experience: 5 to 10 years
Location: Remote
Notice: Immediate or Max 20 Days
Role: Permanent Role
Skillset: AWS, ETL, SQL, Python, PySpark, Postgres DB, Dremio
Job Description:
Able to develop ETL jobs.
Able to help with data curation/cleanup, data transformation, and building ETL pipelines.
Strong Postgres DB experience; knowledge of Dremio as a data virtualization/semantic layer between the database and the application is a plus.
SQL, Python, and PySpark are a must.
Strong communication skills.
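The ETL development this role asks for can be illustrated end to end in a few lines, with stdlib sqlite3 standing in for Postgres and PySpark. The table and column names are made up for the sketch:

```python
import csv
import io
import sqlite3

def run_etl(raw_csv, con):
    """Extract rows from CSV text, cast amounts to float and drop negatives, load into a table."""
    rows = [(r["order_id"], float(r["amount"]))
            for r in csv.DictReader(io.StringIO(raw_csv))
            if float(r["amount"]) >= 0]          # transform: type-cast and filter
    con.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?)", rows)  # load
    return len(rows)

con = sqlite3.connect(":memory:")
loaded = run_etl("order_id,amount\nA1,10.5\nA2,-3\nA3,4.5\n", con)
```

Against a real Postgres instance the connection object would come from a driver such as psycopg2, but the extract/transform/load structure is the same.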
About the company:
VakilSearch is a technology-driven platform, offering services that cover the legal needs of startups and established businesses. Some of our services include incorporation, government registrations & filings, accounting, documentation and annual compliances. In addition, we offer a wide range of services to individuals, such as property agreements and tax filings. Our mission is to provide one-click access to individuals and businesses for all their legal and professional needs.
You can learn more about us at https://vakilsearch.com.
About the role:
A successful data analyst needs a combination of technical as well as leadership skills. A background in Mathematics, Statistics, Computer Science, or Information Management can serve as a solid foundation to build your career as a data analyst at VakilSearch.
Why join Vakilsearch:
- Unlimited opportunities to grow
- Flat hierarchy
- An encouraging environment to unleash your out-of-the-box thinking
Responsibilities:
- Preparing reports for stakeholders and management, enabling them to take important decisions based on facts and trends.
- Using automated tools to extract data from primary and secondary sources
- Identify and recommend the right product metrics to be analysed and tracked for every feature/problem statement.
- Using statistical tools to identify, analyze, and interpret patterns and trends in complex data sets that can aid diagnosis and prediction
- Working with programmers, engineers, and management heads to identify process improvement opportunities, propose system modifications, and devise data governance strategies.
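The statistical pattern-spotting described above can start as simply as flagging z-score outliers. A minimal stdlib sketch with made-up numbers:

```python
from statistics import mean, stdev

def find_anomalies(values, threshold=2.0):
    """Flag points more than `threshold` sample standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

# A hypothetical week of daily signups with one suspicious spike.
daily_signups = [50, 52, 49, 51, 48, 50, 53, 120]
spikes = find_anomalies(daily_signups)
```

In practice an analyst would pair this with visualization and a proper model, but the z-score is a common first diagnostic.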
Required skills:
- Bachelor’s degree from an accredited university or college in computer science, or graduate of a data science related program
- 0 to 2 years of experience in data analysis
• Create and maintain data pipeline
• Build and deploy ETL infrastructure for optimal data delivery
• Work with various teams, including product, design, and executive teams, to troubleshoot data-related issues
• Create tools for data analysts and scientists to help them build and optimise the product
• Implement systems and process for data access controls and guarantees
• Distill knowledge from experts in the field outside the org and optimise internal data systems
Preferred qualifications/skills:
• 5+ years of experience
• Strong analytical skills
Freight Commerce Solutions Pvt Ltd.
• Degree in Computer Science, Statistics, Informatics, Information Systems
• Strong project management and organisational skills
• Experience supporting and working with cross-functional teams in a dynamic environment
• SQL guru with hands-on experience across various databases
• NoSQL databases like Cassandra, MongoDB
• Experience with Snowflake, Redshift
• Experience with tools like Airflow, Hevo
• Experience with Hadoop, Spark, Kafka, Flink
• Programming experience in Python, Java, Scala
Dori AI enables enterprises with AI-powered video analytics to significantly increase human productivity and improve process compliance. We leverage a proprietary full-stack end-to-end computer vision and deep learning platform to rapidly build and deploy AI solutions for enterprises. The platform was built with enterprise considerations including time-to-value, time-to-market, security, and scalability across a range of use cases. Capture visual data across multiple sites, leverage AI + Computer Vision to gather key insights, and make decisions with actionable visual insights. Launch CV applications in a matter of weeks that are optimized for both cloud and edge deployments.
Job brief: Sr. Software Engineer/Software Engineer
All of our team members are expected to learn, learn, and learn! We are working on cutting-edge technologies and areas of artificial intelligence that have never been explored before. We are looking for motivated software engineers with strong coding skills that want to work on problems and challenges they have never worked on before. All of our team members wear multiple hats so you will be expected to simultaneously work on multiple aspects of the products we ship.
Responsibilities
- Participate heavily in the brainstorming of system architecture and feature design
- Interface with external customers and key stakeholders to understand and document design requirements
- Work cross-functionally with Engineering, Data Science, Product, UX, and Infrastructure teams
- Drive best coding practices across the company (i.e. documentation, code reviews, coding standards, etc)
- Perform security, legal, and license reviews of committed code
- Complete projects with little or no supervision from senior leadership
Required Qualifications
- Built and deployed customer-facing services and products at scale
- Developed unit and integration tests
- Worked on products where experimentation and data science are core to the development
- Experience with large-scale distributed systems that have thousands of microservices and manage millions of transactions per day
- Solid understanding of object-oriented design, data structures, and software engineering principles
- At least 4 years of experience in back-end web development with the following tools: Python, Flask, FastAPI, AWS, Azure, or GCP, Java or C/C++, ORM, Mongo, Postgres, TimescaleDB, CI/CD
Desired Experience/Skills
- You have a strong background in software development
- Experience with the following tools: Google Cloud Platform, Objective C/Swift, Github, Docker
- Experience with open-source projects in a startup environment
- BS, MS, or Ph.D. in Computer Science, Software Engineering, Math, Electrical Engineering, or other STEM degree
Degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. Candidates should also have experience using the following software/tools:
● Experience with big data tools: Hadoop/Hive, Spark, Kafka, etc.
● Experience querying multiple SQL/NoSQL databases, including Oracle, MySQL, MongoDB, etc.
● Experience in Redis, RabbitMQ, Elastic Search is desirable.
● Strong experience with object-oriented/functional/scripting languages: Python (preferred), Core Java, JavaScript, Scala, shell scripting, etc.
● Must have strong skills in debugging complex code; experience with ML/AI algorithms is a plus.
● Experience with a version control tool (Git preferred) is mandatory.
● Experience with AWS cloud services: EC2, EMR, RDS, Redshift, S3
● Experience with stream-processing systems: Storm, Spark Streaming, etc.
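Stream-processing systems like Spark Streaming chop an unbounded event stream into micro-batches and run a computation over each one. A dependency-free sketch of that idea, with an invented click stream for illustration:

```python
from collections import Counter

def micro_batches(events, batch_size):
    """Group an event stream into fixed-size micro-batches, as Spark Streaming does by time window."""
    batch = []
    for event in events:
        batch.append(event)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # flush the final partial batch

# Running a word-count over each micro-batch of a simulated click stream.
stream = iter(["home", "cart", "home", "pay", "home", "cart"])
counts = [Counter(b) for b in micro_batches(stream, 3)]
```

Real engines add time-based windows, fault tolerance, and state, but the per-batch aggregation pattern is the core concept.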
Work days: Sun-Thu
Day shift
• Total of 4+ years of experience in development, architecting/designing and implementing Software solutions for enterprises.
• Must have strong programming experience in either Python or Java/J2EE.
• Minimum of 4 years' experience working with various cloud platforms, preferably Google Cloud Platform.
• Experience architecting and designing solutions leveraging Google Cloud products such as BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Bigtable, and TensorFlow will be highly preferred.
• Presentation skills with a high degree of comfort speaking with management and developers
• The ability to work in a fast-paced work environment
• Excellent communication, listening, and influencing skills
RESPONSIBILITIES:
• Lead teams to implement and deliver software solutions for Enterprises by understanding their requirements.
• Communicate efficiently and document the Architectural/Design decisions to customer stakeholders/subject matter experts.
• Learn new products quickly and rapidly comprehend new technical areas, both technical and functional, applying detailed and critical thinking to customer solutions.
• Implementing and optimizing cloud solutions for customers.
• Migration of Workloads from on-prem/other public clouds to Google Cloud Platform.
• Provide solutions to team members for complex scenarios.
• Promote good design and programming practices with various teams and subject matter experts.
• Ability to work on any product on the Google cloud platform.
• Must be hands-on and be able to write code as required.
• Ability to lead junior engineers and conduct code reviews
QUALIFICATION:
• Minimum B.Tech/B.E Engineering graduate