
● Create and maintain optimal data pipeline architecture.
● Assemble large, complex data sets that meet functional / non-functional
business requirements.
● Build and optimize ‘big data’ data pipelines, architectures and data sets.
● Maintain, organize & automate data processes for various use cases.
● Identify trends, perform follow-up analysis and prepare visualizations.
● Create daily, weekly and monthly reports of product KPIs.
● Create informative, actionable and repeatable reporting that highlights
relevant business trends and opportunities for improvement.
Required Skills And Experience:
● 2-5 years of work experience in data analytics, including analyzing large data sets.
● BTech in Mathematics/Computer Science.
● Strong analytical, quantitative and data interpretation skills.
● Hands-on experience with Python, Apache Spark, Hadoop, NoSQL databases (MongoDB preferred) and Linux is a must (a minimal PySpark sketch follows this list).
● Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
● Experience with Google Cloud Data Analytics Products such as BigQuery, Dataflow, Dataproc etc. (or similar cloud-based platforms).
● Experience working within a Linux computing environment, and use of
command-line tools including knowledge of shell/Python scripting for
automating common tasks.
● Previous experience working at startups and/or in fast-paced environments.
● Previous experience as a data engineer or in a similar role.
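As a rough illustration of the Python/Spark pipeline and KPI-reporting work described above, here is a minimal PySpark sketch. The input path, column names, and output location are hypothetical assumptions, not part of the role description.

```python
# Minimal PySpark sketch: aggregate daily product KPIs from raw event data.
# Paths and column names (event_ts, product_id, user_id, ...) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_kpi_report").getOrCreate()

events = spark.read.parquet("/data/raw/events")  # hypothetical input path

daily_kpis = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "product_id")
    .agg(
        F.countDistinct("user_id").alias("dau"),
        F.count(F.lit(1)).alias("events"),
        F.avg("session_length_sec").alias("avg_session_sec"),
    )
)

# Write a partitioned data set that downstream daily/weekly/monthly reports can query.
daily_kpis.write.mode("overwrite").partitionBy("event_date").parquet("/data/reports/daily_kpis")
```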

Preferred Education & Experience:
• Bachelor’s or master’s degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics or a related technical field, or equivalent practical experience.
• Well-versed in the following, with 5+ years of hands-on, demonstrable experience:
▪ Data Analysis & Data Modeling
▪ Database Design & Implementation
▪ Database Performance Tuning & Optimization
▪ PL/pgSQL & SQL
• 5+ years of hands-on development experience with relational databases (PostgreSQL/SQL Server/Oracle).
• 5+ years of hands-on development experience in SQL and PL/pgSQL, including stored procedures, functions, triggers, and views.
• Demonstrable hands-on experience with database design principles, SQL query optimization techniques, index management, integrity checks, statistics, and isolation levels (see the sketch after this list).
• Demonstrable hands-on experience with database read & write performance tuning and optimization.
• Knowledge of and experience with Domain-Driven Design (DDD), object-oriented programming (OOP), cloud architecture, and NoSQL database concepts are an added advantage.
• Knowledge of and working experience in the Oil & Gas, Financial, and Automotive domains is a plus.
• Hands-on development experience with one or more NoSQL datastores such as Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc. is a plus.
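To make the query-optimization and index-management expectations above concrete, here is a minimal, illustrative Python sketch of that workflow against PostgreSQL. It assumes psycopg2; the connection string, the orders table, its columns, and the index name are all hypothetical.

```python
# Minimal sketch of a PostgreSQL query-tuning loop driven from Python.
# The DSN, the orders table, and the index name are hypothetical.
import psycopg2

QUERY = "SELECT customer_id, SUM(amount) FROM orders WHERE order_date >= %s GROUP BY customer_id"

with psycopg2.connect("dbname=appdb user=report") as conn:
    with conn.cursor() as cur:
        # Inspect the execution plan before changing anything.
        cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + QUERY, ("2023-01-01",))
        for (line,) in cur.fetchall():
            print(line)

        # If the plan shows a sequential scan on order_date, add a covering index
        # (INCLUDE requires PostgreSQL 11+).
        cur.execute(
            "CREATE INDEX IF NOT EXISTS idx_orders_order_date "
            "ON orders (order_date) INCLUDE (customer_id, amount)"
        )
    conn.commit()
```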
Job Location: Pune/Remote
Work Timings: 2:30 pm - 11:30 pm
Joining Period: Immediate to 20 days
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Ingest data from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems; implement data ingestion and processing with the help of Big Data technologies
- Process and transform data using various technologies such as Spark and cloud services; understand the relevant business logic and implement it using the language supported by the underlying data platform
- Develop automated data quality checks to make sure the right data enters the platform and to verify the results of calculations (see the sketch after this list)
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Develop queries, write reports and present findings
- Mentor junior members and bring in industry best practices
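As an illustration of the automated data quality checks mentioned in the responsibilities above, here is a minimal PySpark sketch of a validation gate. The S3 paths, required columns, and failure threshold are hypothetical assumptions.

```python
# Minimal sketch of an automated data-quality gate before data enters the platform.
# The input/output paths, required columns, and thresholds are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.read.parquet("s3://my-bucket/ingest/loans/")  # hypothetical landing zone

required_cols = {"loan_id", "customer_id", "principal", "origination_date"}
missing = required_cols - set(df.columns)
if missing:
    raise ValueError(f"Schema check failed, missing columns: {missing}")

total = df.count()
null_ids = df.filter(F.col("loan_id").isNull()).count()
dupes = total - df.dropDuplicates(["loan_id"]).count()

# Fail the pipeline run if more than 0.1% of rows are bad.
if total == 0 or (null_ids + dupes) / total > 0.001:
    raise ValueError(f"Data quality gate failed: total={total}, null_ids={null_ids}, dupes={dupes}")

df.write.mode("append").parquet("s3://my-bucket/validated/loans/")
```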
QUALIFICATIONS
- 5-7+ years’ experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of at least one language: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with
- Data mining/programming tools (e.g. SAS, SQL, R, Python)
- Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
- Data visualization (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting languages - Python & PySpark
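A minimal PySpark sketch of the file-based ingestion pattern implied by the Glue/Parquet/S3 requirements above. The bucket names, source layout, and the transaction_id column are hypothetical; inside an actual AWS Glue job the same logic would run on the SparkSession the job provides.

```python
# Minimal PySpark sketch of file-based ingestion into an S3 data lake as Parquet.
# Bucket names and the partition column are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3_ingest").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .csv("s3://raw-zone/transactions/2023-03-27/")  # hypothetical source drop
)

curated = (
    raw
    .withColumn("ingest_date", F.current_date())
    .dropDuplicates(["transaction_id"])
)

# Columnar, partitioned Parquet keeps downstream Snowflake/Athena scans cheap.
(
    curated.write
    .mode("append")
    .partitionBy("ingest_date")
    .parquet("s3://curated-zone/transactions/")
)
```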
JD Code: SHI-LDE-01
Version#: 1.0
Date of JD Creation: 27-March-2023
Position Title: Lead Data Engineer
Reporting to: Technical Director
Location: Bangalore Urban, India (on-site)
SmartHub.ai (www.smarthub.ai) is a fast-growing startup headquartered in Palo Alto, CA, with offices in Seattle and Bangalore. We operate at the intersection of AI, IoT & Edge Computing. With strategic investments from leaders in infrastructure & data management, SmartHub.ai is redefining the Edge IoT space. Our “Software Defined Edge” products help enterprises rapidly accelerate their Edge Infrastructure Management & Intelligence. We empower enterprises to leverage their Edge environment to increase revenue, improve operational efficiency, and manage safety and digital risks using Edge and AI technologies.
SmartHub is an equal opportunity employer and is, and will always be, committed to nurturing a workplace culture that supports, inspires and respects all individuals, and encourages employees to bring their best selves to work, laugh and share. We seek builders from a variety of backgrounds, perspectives and skills to join our team.
Summary
This role requires the candidate to translate business and product requirements into building, maintaining and optimizing data systems, which can be relational or non-relational in nature. The candidate is expected to tune and analyse the data, including for short- and long-term trend analysis, reporting, and AI/ML use cases.
We are looking for a talented technical professional with at least 8 years of proven experience in owning, architecting, designing, operating and optimising databases that are used for large scale analytics and reports.
Responsibilities
- Provide technical & architectural leadership for the next generation of product development.
- Innovate, Research & Evaluate new technologies and tools for a quality output.
- Architect, design and implement solutions, ensuring scalability, performance and security.
- Code and implement new algorithms to solve complex problems.
- Analyze complex data, develop, optimize and transform large data sets both structured and unstructured.
- Deploy and administer databases and continuously tune them for performance, especially on container orchestration stacks such as Kubernetes.
- Develop analytical models and solutions.
- Mentor junior members technically in architecture, design and robust coding.
- Work in an Agile development environment while continuously evaluating and improving engineering processes.
Required
- At least 8 years of experience with significant depth in designing and building scalable distributed database systems for enterprise class products, experience of working in product development companies.
- Should have been feature/component lead for several complex features involving large datasets.
- Strong background in relational and non-relational databases such as Postgres, MongoDB, Hadoop, etc.
- Deep expertise in database optimization and tuning; SQL, Time Series Databases, Apache Drill, HDFS and Spark are good to have.
- Excellent analytical and problem-solving skill sets.
- Experience with high-throughput workloads is highly desirable.
- Exposure to database provisioning in Kubernetes/non-Kubernetes environments, configuration and tuning in a highly available mode.
- Demonstrated ability to provide technical leadership and mentoring to the team
Job Description
Condé Nast has a huge global footprint across 32 countries worldwide with total monthly visitors reaching more than 550 million. We aim to inspire, inform and entertain our audiences through our portfolio of well known brands such as Vogue, Wired, Vanity Fair, Architectural Digest, GQ and many other leading brands.
Condé Nast has operations in America, Europe, South America and Asia. There are long-established operations in major Indian cities such as Mumbai and Chennai across many business areas.
The company is undertaking the biggest transformation in its history, driving towards expanding both its operations and its reach through diversifying the digital business models for our brands. Major areas of focus will include video and streaming platforms, increasing subscriptions and memberships, growing our events business, and ensuring we maintain a high level of standard on the products we are already proud of delivering to our customers. Migrating and consolidating to a more centralised set of technologies and capabilities that can be easily leveraged by any brand or product experience is also a top business priority.
Engineering is responsible for scaling, building and deploying the websites that serve over 500 million users globally. The global engineering team includes Product Engineering, Platform Engineering and Core Engineering.
About Brand Engineering
Our mission is to build compelling and inspiring experiences across our portfolio of global brands such as Vogue, Wired, Vanity Fair, and Glamour which currently reaches over 500 million monthly users. Our goals are extremely ambitious; we are rolling out our brands into new countries and territories and pushing to serve more consumers across web, native (iOS / Android), video and social platforms.
We want you to join us as we embark on further global expansion of our brands, creation of new features to deliver our engaging content to consumers worldwide, and diversifying our revenue models. We believe in fostering a culture of experimentation and innovation, often running hundreds of a/b experiments in production every day!
You will be working alongside international colleagues across US, London, and India. Cross-functional collaboration is at the heart of what we do and you will be comfortable working in close partnership with Product, Design, UX, and Data. The Brand Engineering group works in close partnership with the Platform Engineering group to implement cross-cutting solutions that can be leveraged by the Brands to deliver robust end-to-end experiences for our consumers.
The Role
We’re looking for a Senior JavaScript engineer to join our team at our Chennai/Bengaluru office. We ideally hire engineers who are comfortable across the full stack, but we know you'll have a preference about being on the front- or back-end. As long as you're happy to work on both sets of tasks – you should carry on reading!
Our tech
- Languages and Frameworks: JavaScript, Node.js, React (and the React ecosystem)
- APIs: GraphQL
- Data: Postgres, MongoDB, Elasticsearch
- Deployment: Docker, AWS, Kubernetes
- Source Control: Git, GitHub
How we work
- Infrastructure as Code everywhere
- Pairing. We like knowledge sharing and upskilling
- Remote friendly. We work with engineers across time zones & locations
- On call (teams are responsible for their own apps)
Role and responsibilities
- Working on our collection of Node apps; we use a mix of Express, Hapi, and Fastify
- Working on our front-end apps; we use a mix of React and Next.js, but as long as you have experience in at least one framework we don’t really mind
- Shipping features and services to anywhere between thousands and millions of customers per day
- Collaborating with our amazing infrastructure team
- Mentoring colleagues at every level, and a willingness to be mentored by them
- Supporting your Tech Lead and Engineering Manager in technical decision making, solution design and estimation
- Pairing with members of your team to identify the most optimal solution; regularly conducting code reviews to ensure high standards of engineering quality in the code that we write
- Working within a cross-functional team that includes designers, UX engineers, and product
- Looking at logs and monitoring metrics and responding to alerts; we use Kibana, Elasticsearch and Datadog extensively, but as long as you’re aware of how important observability is, we don’t mind what tools you may have used in the past
- Taking part in our 24/7 “On Call” rota to troubleshoot any major operational issues out of hours; teams are responsible for their own apps’ availability and reliability
About you - Essential skills
- You have a BE / BTech or equivalent experience
- You’re a software engineer with 7-9 years of experience
- Expert knowledge of JavaScript and Node.js, a good understanding of ES9 and React.js, and experience with JavaScript testing frameworks (such as Jest, Mocha, etc.)
- Good understanding of Cloud Native architecture, containerisation, Docker, AWS, CI/CD, and DevOps culture; foundational Kubernetes knowledge would be a great bonus, as we use it extensively
- Practical experience in the use of leading engineering practices and principles
- Practical experience of building robust solutions at large scale; we serve traffic in the many millions every month, so our products need to scale seamlessly to meet customer demand
- Appreciation for the functions of Product and Design, and experience working in cross-functional teams
- You’ll try to make the codebase nicer for the next person who visits it
- You’re someone who’ll own every step of the development process, from refining tickets to shipping the code to production
- You’ll do your best, and ask for help when it’s needed
- You have an interest or desire to learn about every part of the stack: from the latest JavaScript standards, to GraphQL, to accessibility, and the infrastructure we deploy to (we use Kubernetes extensively, but we have a dedicated platform team on hand to help!)
- Someone who can communicate effectively in a variety of media (through Slack and in person) within their team, outside of their team, with people in our product and design families, and even with our users
- Able to effectively communicate technical concepts to different audiences
- Business proficiency in spoken and written English
- You will be working with a global team and need to accommodate different time zones as required
About you - Desirable skills
- Experience of working with international teams across multiple time zones
- Mentoring and/or management experience
- Knowledge of continuous integration, testing strategies, design systems, software architecture, data and analytics, user experience, accessibility, internationalisation, and web performance
- Experience and/or interest in working with digital advertising
- Experience and/or interest in working in publishing
- Experience and/or interest in working with fashion
- Proficiency working with a variety of third-party APIs and developer tools
- Commercial experience using Kubernetes, React and GraphQL
We are looking for candidates who have demonstrated both a strong business sense and deep understanding of the quantitative foundations of modelling.
• Excellent analytical and problem-solving skills, including the ability to disaggregate issues, identify root causes and recommend solutions
• Experience with statistical programming software such as SPSS, and comfort working with large data sets (see the sketch after this list).
• R, Python, SAS & SQL are preferred but not mandatory
• Excellent time management skills
• Good written and verbal communication skills; understanding of both written and spoken English
• Strong interpersonal skills
• Ability to act autonomously, bringing structure and organization to work
• Creative and action-oriented mindset
• Ability to interact in a fluid, demanding and unstructured environment where priorities evolve constantly, and methodologies are regularly challenged
• Ability to work under pressure and deliver on tight deadlines
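As a small illustration of working with a large response data set in Python (one of the preferred tools listed above), here is a pandas sketch; the file name, column names, and threshold are hypothetical.

```python
# Minimal pandas sketch of summarising a large survey extract.
# The file name and column names are hypothetical.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # e.g. one row per respondent

summary = (
    responses
    .groupby(["segment", "wave"])["satisfaction_score"]
    .agg(["count", "mean", "std"])
    .reset_index()
)

# Flag segments whose mean score dropped by more than 0.5 between waves.
pivot = summary.pivot(index="segment", columns="wave", values="mean")
print(pivot[pivot.diff(axis=1).min(axis=1) < -0.5])
```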
Qualifications and Experience:
• Graduate degree in: Statistics/Economics/Econometrics/Computer
Science/Engineering/Mathematics/MBA (with a strong quantitative background) or
equivalent
• Strong track record work experience in the field of business intelligence, market
research, and/or Advanced Analytics
• Knowledge of data collection methods (focus groups, surveys, etc.)
• Knowledge of statistical packages (SPSS, SAS, R, Python, or similar), databases,
and MS Office (Excel, PowerPoint, Word)
• Strong analytical and critical thinking skills
• Industry experience in Consumer Experience/Healthcare a plus
• 2 - 5 years of experience building React and/or Mobile Applications
• 5-8 years working with microservices, API servers, databases, cloud-native development, observability,
alerting, and monitoring
• Deep exposure to cloud services, preferably Azure
• Preferably worked in the Finance/Retail domain or other similar domains with complex business
requirements.
• Hands-on skills combined with leadership qualities to guide teams.
Location – Bangalore, Mumbai, Gurgaon
Functional / Technical Skills:
• Strong understanding of networking fundamentals
o OSI Stack, DNS, TCP protocols
o Browser rendering and various stages of execution
• Good understanding of RESTful APIs, GraphQL and Web Sockets
• Ability to debug and profile Web/Mobile applications with Chrome DevTools or Native profilers
• Strong understanding of Distributed Systems, Fault Tolerance and Resiliency. Exposure to setting up and managing chaos engineering experiments is a plus.
• Exposure to Domain Driven Design (DDD), SOLID principles, and Data Modelling on various RDBMS,
NoSQL databases.
• Ability to define and document performance goals, SLAs, and volumetrics. Creating a framework for
measuring and validating the goals. Work with teams to implement and meet them.
• Create automation scripts to measure performance and make this part of the CI/CD process (see the sketch after this list).
• Good understanding of CNCF projects with a specific focus on Observability, Monitoring, Tracing,
Sidecars, Kubernetes
• Tuning of Cloud-native deployments with a focus on Cost Optimization.
• Participate in architecture reviews to identify potential issues and bottlenecks, and provide early guidance.
• Deep knowledge of at least 2 different programming languages and runtimes. Any two of Ruby, Python,
Swift, Go, Rust, C#, Dart, Kotlin, Java, Haskell, OCaml
• Excellent verbal and written communication
• A mindset to constantly learn new things and challenge the Status Quo.
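As a sketch of the “automation scripts to measure performance” item above, here is a minimal Python script that could act as a CI/CD performance gate. The endpoint URL, sample count, and SLA threshold are hypothetical assumptions.

```python
# Minimal sketch of a performance gate that could run in a CI/CD stage.
# The endpoint URL and the SLA numbers are hypothetical.
import statistics
import time

import requests

URL = "https://staging.example.com/api/catalog"  # hypothetical service under test
P95_SLA_MS = 300
SAMPLES = 50

latencies_ms = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    requests.get(URL, timeout=5)
    latencies_ms.append((time.perf_counter() - start) * 1000)

p95 = statistics.quantiles(latencies_ms, n=100)[94]  # 95th percentile cut point
print(f"p95 latency: {p95:.1f} ms")

# A non-zero exit code fails the pipeline stage when the SLA is breached.
if p95 > P95_SLA_MS:
    raise SystemExit(f"SLA breach: p95 {p95:.1f} ms > {P95_SLA_MS} ms")
```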
Come Join Qualifyde’s Interviewing world and be part of the next generation of technical interviews.
QUALIFYDE Tech nerds are a network of software professionals, including software development managers, software engineers, and freelancers covering the full technology stack.
Technical Interviews with QUALIFYDE occur 24/7 with candidates located throughout the world
Conduct interviews based on your own availability — whether that be five times a week, 10 times a week, mornings, nights, or weekends.
Utilize the QUALIFYDE Interviewing infrastructure and process to conduct fair, equitable technical interviews in a wide variety of roles.
Who can apply:
- Software developers/engineers with hands-on coding experience and experience conducting technical interviews
- Strong Communication Skills in English (Oral & Written)
- Stable, Consistent internet connection
- Equipped with a working webcam
- Design, create, test, and maintain data pipeline architecture in collaboration with the Data Architect.
- Build the infrastructure required for extraction, transformation, and loading of data from a wide variety of data sources using Java, SQL, and Big Data technologies.
- Support the translation of data needs into technical system requirements. Support in building complex queries required by the product teams.
- Build data pipelines that clean, transform, and aggregate data from disparate sources
- Develop, maintain and optimize ETLs to increase data accuracy, data stability, data availability, and pipeline performance.
- Engage with Product Management and Business to deploy and monitor products/services on cloud platforms.
- Stay up-to-date with advances in data persistence and big data technologies and run pilots to design the data architecture to scale with the increased data sets of consumer experience.
- Handle data integration, consolidation, and reconciliation activities for digital consumer / medical products.
Job Qualifications:
- Bachelor’s or master's degree in Computer Science, Information management, Statistics or related field
- 5+ years of experience in the Consumer or Healthcare industry in an analytical role, with a focus on building data pipelines, querying data, analyzing, and clearly presenting analyses to members of the data science team.
- Technical expertise with data models, data mining.
- Hands-on Knowledge of programming languages in Java, Python, R, and Scala.
- Strong knowledge of Big Data tools such as Snowflake, AWS Redshift, Hadoop, MapReduce, etc.
- Knowledge of tools such as AWS Glue, S3, AWS EMR, streaming data pipelines, and Kafka/Kinesis is desirable (see the sketch after this list).
- Hands-on knowledge of SQL and NoSQL database design.
- Knowledge of CI/CD for building and hosting the solutions.
- An AWS certification is an added advantage.
- Strong knowledge of visualization tools like Tableau and QlikView is an added advantage.
- A team player capable of working and integrating across cross-functional teams for implementing project requirements. Experience in technical requirements gathering and documentation.
- Ability to work effectively and independently in a fast-paced agile environment with tight deadlines
- A flexible, pragmatic, and collaborative team player with the innate ability to engage with data architects, analysts, and scientists
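As a rough illustration of the streaming data pipeline skills mentioned above (Kafka feeding an S3 data lake), here is a minimal Spark Structured Streaming sketch. The broker address, topic, and S3 paths are hypothetical, and it assumes the Spark Kafka connector is available on the cluster.

```python
# Minimal sketch of a streaming ingest from Kafka into the data lake with
# Spark Structured Streaming. Broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka_ingest").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "consumer-events")
    .load()
    .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS payload", "timestamp")
)

query = (
    events.withColumn("ingest_date", F.to_date("timestamp"))
    .writeStream
    .format("parquet")
    .option("path", "s3://lake/raw/consumer-events/")
    .option("checkpointLocation", "s3://lake/checkpoints/consumer-events/")
    .partitionBy("ingest_date")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```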
We are looking for a self-motivated and passionate individual with a strong desire to learn and the ability to lead. This position is for a Flight Test Engineer with exposure to building and flying sUAS (RC multirotors and fixed wings). See the detailed job description below.
Responsibilities
• Plan and execute flight test plans for new software features, electronics, sensors, and payloads.
• Perform hands-on mechanical and electrical integration of new hardware components on the internal fleet of test vehicles for R&D and testing.
• Troubleshoot and debug any components of a drone in the office or in the field. Maintenance of vehicles – keep the fleet ready for flight tests.
• Participate in defining and validating customer workflows and enhancing User experience.
• Coordinate cross-team efforts among FlytBase engineers to resolve issues identified during flight tests.
• Drive collaboration with FlytBase Developer team, Business Development team and Customer Support team to incorporate customer feedback and feature requests into FlytBase’s product development cycle.
• Learn about the domain and competitors to propose new drone applications, as well as improvements to existing applications.
Experience/Skills
• Experience in flight testing and operating/piloting small UAS and/or RC aircraft (both fixed-wing and multirotor systems).
• Experience in using flight-planning and ground control station software.
• Familiarity with UAV platforms such as Pixhawk, DJI, ArduPilot and PX4.
• Experience in integrating, operating, and tuning autopilots on a variety of unmanned vehicles.
• Basic knowledge of electrical test equipment (multimeter, oscilloscope) and UAS sensors.
• Ability to work hands-on with electro-mechanical systems, including assembly, disassembly, testing and troubleshooting.
•Good verbal and written communication skills.
Good to have
• RF communications fundamentals.
• Passionate about aerial robots i.e. drones.
• Programming languages and scripting for engineering use (C++, C, MATLAB, Python).
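As a small example of Python scripting for engineering use in this context, here is a hedged sketch of a pre-flight telemetry check using pymavlink; the connection string is hypothetical (e.g. a SITL instance or a telemetry radio endpoint), and the checks shown are illustrative only.

```python
# Minimal sketch of a pre-flight telemetry check over MAVLink using pymavlink.
# The connection string is hypothetical (e.g. a SITL instance or telemetry radio).
from pymavlink import mavutil

master = mavutil.mavlink_connection("udp:127.0.0.1:14550")
master.wait_heartbeat()
print(f"Heartbeat from system {master.target_system}, component {master.target_component}")

# Log battery voltage and GPS fix for a quick go/no-go call before a test flight.
for _ in range(10):
    battery = master.recv_match(type="SYS_STATUS", blocking=True, timeout=5)
    gps = master.recv_match(type="GPS_RAW_INT", blocking=True, timeout=5)
    if battery and gps:
        print(
            f"voltage={battery.voltage_battery / 1000.0:.2f} V, "
            f"fix_type={gps.fix_type}, sats={gps.satellites_visible}"
        )
```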
Compensation:
As per industry standards.
Perks:
+ Fast-paced Startup culture
+ Hacker mode environment
+ Great team
+ Flexible work hours
+ Informal dress code
+ Free snacks

