Data Flow Jobs in Delhi, NCR and Gurgaon
Apply to 11+ Data Flow jobs in Delhi, NCR and Gurgaon on CutShort.io. Explore the latest Data Flow job opportunities across top companies like Google, Amazon, and Adobe.
● Create and maintain optimal data pipeline architecture.
● Assemble large, complex data sets that meet functional and non-functional business requirements.
● Build and optimize ‘big data’ pipelines, architectures, and data sets.
● Maintain, organize, and automate data processes for various use cases.
● Identify trends, perform follow-up analysis, and prepare visualizations.
● Create daily, weekly, and monthly reports of product KPIs (a minimal sketch follows this list).
● Create informative, actionable, and repeatable reporting that highlights relevant business trends and opportunities for improvement.
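A minimal sketch of the recurring KPI reporting described above, assuming a hypothetical events.csv with user_id and event_time columns (file name, column names, and KPI choices are all illustrative):

```python
import pandas as pd

# Hypothetical input: one row per product event.
events = pd.read_csv("events.csv", parse_dates=["event_time"])

# Daily KPIs: unique active users and total event volume per calendar day.
daily = (
    events.assign(day=events["event_time"].dt.date)
          .groupby("day")
          .agg(active_users=("user_id", "nunique"),
               events=("user_id", "size"))
)

# Weekly (and, analogously, monthly) reports reuse the same aggregation
# at a coarser grain.
weekly = (
    events.assign(week=events["event_time"].dt.to_period("W"))
          .groupby("week")
          .agg(active_users=("user_id", "nunique"),
               events=("user_id", "size"))
)

daily.to_csv("daily_kpis.csv")
weekly.to_csv("weekly_kpis.csv")
```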
Required Skills And Experience:
● 2-5 years of work experience in data analytics, including analyzing large data sets.
● BTech in Mathematics/Computer Science.
● Strong analytical, quantitative, and data-interpretation skills.
● Hands-on experience with Python, Apache Spark, Hadoop, NoSQL databases (MongoDB preferred), and Linux is a must.
● Experience building and optimizing ‘big data’ pipelines, architectures, and data sets.
● Experience with Google Cloud data analytics products such as BigQuery, Dataflow, Dataproc, etc. (or similar cloud-based platforms).
● Experience working within a Linux computing environment and with command-line tools, including shell/Python scripting for automating common tasks (see the sketch after this list).
● Previous experience working at startups and/or in fast-paced environments.
● Previous experience as a data engineer or in a similar role.
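A small example of the shell/Python task automation this list refers to: compressing application logs older than a week. The directory path and retention window are made up:

```python
#!/usr/bin/env python3
"""Compress *.log files older than 7 days (illustrative path/retention)."""
import gzip
import shutil
import time
from pathlib import Path

LOG_DIR = Path("/var/log/myapp")      # hypothetical log directory
MAX_AGE_SECONDS = 7 * 24 * 3600       # keep a week of uncompressed logs

for log in LOG_DIR.glob("*.log"):
    if time.time() - log.stat().st_mtime > MAX_AGE_SECONDS:
        # Stream-copy into a gzip file, then remove the original.
        with log.open("rb") as src, gzip.open(f"{log}.gz", "wb") as dst:
            shutil.copyfileobj(src, dst)
        log.unlink()
```

In practice a script like this would run from cron or a scheduler rather than by hand.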
- KSQL
- Data Engineering spectrum (Java/Spark)
- Spark Scala / Kafka Streaming (a minimal PySpark sketch follows this list)
- Confluent Kafka components
- Basic understanding of Hadoop
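A minimal sketch of the Kafka-to-Spark streaming pattern this stack implies. It uses PySpark's Structured Streaming API rather than Scala to keep these examples in one language; the broker address and topic name are placeholders, and the spark-sql-kafka connector is assumed to be on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Subscribe to a (hypothetical) Kafka topic.
stream = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder
         .option("subscribe", "events")                        # placeholder
         .load()
)

# Kafka delivers key/value as binary; cast the payload to a string.
messages = stream.select(col("value").cast("string").alias("payload"))

# Console sink for demonstration; a real job would write to a durable sink
# with checkpointing enabled.
query = messages.writeStream.format("console").start()
query.awaitTermination()
```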
Who We Are
Orbo is a research-oriented company with core expertise in computer vision and artificial intelligence. At its heart is a comprehensive stack of AI-based visual enhancement products, so companies can pick the product that suits their needs and let deep-learning-powered technology automatically improve their imagery.
ORBO's solutions support digital transformation in BFSI and in beauty and personal care, as well as image retouching for e-commerce.
WHY US
- Join a top AI company
- Grow with your best companions
- Continuous pursuit of excellence, equality, respect
- Competitive compensation and benefits
You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.
To learn more about how we work, please check out https://www.orbo.ai/.
We at Orbo are looking for developers who have a passion for technology and the ability to deploy new and novel technologies in environments that are often short on both people and capital. Beyond solving complex algorithmic problems, our software development engineers build algorithms that address real-life challenges, and they take part in the system design of several applications, contributing to their technical architecture.
Responsibilities:
- Determining the scope of software development projects.
- Collaborating with the software development team on application design and development.
- Developing software and overseeing the deployment of applications across platforms.
- Performing diagnostic tests and debugging procedures.
- Creating end-user application feedback channels.
- Optimizing software by performing maintenance, updates, and upgrades.
- Documenting processes and maintaining software development records.
- Keeping up to date with C++ standards and advancements in application development.
Requirements:
- Bachelor's degree in computer science, information systems, or similar.
- Strong understanding of OOP and the C++ Standard Template Library (STL).
- Problem solving with data structures and algorithms.
- Build tools (Make/CMake/Ninja/Bazel).
- Experience with OS- and hardware-specific development (SIMD, AVX, AVX2).
- Knowledge of third-party library integration (OpenCV, TensorFlow, NCNN, TNN, libtorch, OpenVINO, ONNX Runtime); a small sketch follows this list.
- Experience with modern C++.
- Experience with Linux and Windows.
- Experience with multithreading and multiprocessing.
- Superb analytical and problem-solving skills.
- Excellent collaboration and communication skills.
- Great organizational and time management skills.
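The role above centres on C++; purely as an illustration, and to keep these sketches in one language, here is the same kind of third-party inference-library integration shown through ONNX Runtime's Python bindings. The model path and input shape are made up; the C++ API follows the same session/run pattern:

```python
import numpy as np
import onnxruntime as ort

# Load a (hypothetical) exported vision model on CPU.
session = ort.InferenceSession("model.onnx",
                               providers=["CPUExecutionProvider"])

# Feed a dummy NCHW float32 batch; the real shape depends on the model.
input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)

outputs = session.run(None, {input_name: dummy})
print(outputs[0].shape)
```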
About this role
We are seeking a seasoned DBA to join our team. You will have extensive experience as a database administrator in mission-critical environments over a successful career. You will primarily work on supporting our databases: AWS RDS (MySQL) and MongoDB/Cassandra.
What You’ll Do
- Provide leadership in database scaling & monitoring. Maintain runbooks for our 24x7 SRE team.
- Assist in troubleshooting and resolution of database issues. Perform database upgrades and patches.
- Define and develop data pipelines between platforms in a mission-critical environment using DMS, Fivetran, Kafka, etc.
- Work with our developers to optimize and tune their SQL queries (see the sketch after this list).
- Assist in application development, debugging, and optimization with respect to data concerns.
- Provide input into the design and implementation of backup, recovery, and DR strategies.
- Review application and database design for compliance.
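A sketch of the query-tuning loop mentioned above: running EXPLAIN against MySQL from Python. Connection details and the query are placeholders, and MySQL Connector/Python is assumed to be installed:

```python
import mysql.connector  # MySQL Connector/Python (assumed installed)

# Placeholder connection details.
cnx = mysql.connector.connect(host="localhost", user="dba",
                              password="...", database="app")
cur = cnx.cursor()

# Inspect the plan of a (hypothetical) slow query; a missing index
# typically shows up here as a full table scan.
cur.execute("EXPLAIN SELECT * FROM orders WHERE customer_id = 42")
for row in cur.fetchall():
    print(row)

cur.close()
cnx.close()
```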
What You’ll Need
- Ability to script in a high-level language like Python, Ruby, etc., for automation (a short example follows this list).
- 8+ years of experience with database services such as AWS RDS (MySQL) and MongoDB/Cassandra.
- 4+ years of experience in database design and DevOps on AWS.
- Must demonstrate a clear understanding of logical and physical database design and standards.
- Experience in columnar data warehouse solutions like Redshift or Snowflake.
- Must have extensive hands-on experience with MySQL in a large scale 24x7 production environment with millions of records.
- Bachelor of Science in Computer Science, Mathematics, or Engineering; or equivalent work experience
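As referenced in the first bullet, a short example of the automation scripting this role calls for, using boto3 to flag RDS instances that are not in the "available" state. The region is a placeholder and AWS credentials are assumed to be configured:

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # placeholder region

# Flag any RDS instance that is not currently "available".
for db in rds.describe_db_instances()["DBInstances"]:
    status = db["DBInstanceStatus"]
    if status != "available":
        print(f'{db["DBInstanceIdentifier"]}: {status}')
```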
Bonus Points If You Have
- Experience with NoSQL solutions such as Cassandra/MongoDB etc.
- Experience with software development life cycle (SDLC) in an agile environment
- Hands-on production experience with Big Data applications such as Spark.
- Knowledge of development best practices (source control with Git, continuous integration, automated testing).
JD:
Required Skills:
- Intermediate to expert-level hands-on programming in one of the following: Java, Python, PySpark, or Scala.
- Strong practical knowledge of SQL.
- Hands-on experience with Spark/Spark SQL
- Data structures and algorithms
- Hands-on experience as an individual contributor in the design, development, testing, and deployment of applications based on Big Data technologies
- Experience with Big Data tools such as Hadoop, MapReduce, Spark, etc.
- Experience with NoSQL databases such as HBase, etc.
- Experience with the Linux OS environment (shell scripting, AWK, sed)
- Intermediate RDBMS skills: able to write SQL queries with complex relations on top of a large RDBMS (100+ tables); a short Spark SQL sketch follows
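A minimal Spark SQL sketch of the multi-table query work this JD describes; the table paths, names, and columns are all invented:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sparksql-sketch").getOrCreate()

# Hypothetical tables standing in for part of a large RDBMS schema.
spark.read.parquet("/data/orders").createOrReplaceTempView("orders")
spark.read.parquet("/data/customers").createOrReplaceTempView("customers")

# A join-and-aggregate query of the kind described above.
top_customers = spark.sql("""
    SELECT c.customer_id,
           c.name,
           SUM(o.amount) AS total_spend
    FROM orders o
    JOIN customers c ON o.customer_id = c.customer_id
    GROUP BY c.customer_id, c.name
    ORDER BY total_spend DESC
    LIMIT 10
""")
top_customers.show()
```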