Data Engineer
Greenway Health
6 - 8 yrs
₹8L - ₹15L / yr
Bengaluru (Bangalore)
Skills
Spark
Hadoop
Big Data
Data engineering
PySpark
Python
AWS Lambda
SQL
Kafka
6-8 years of experience as a Data Engineer, with hands-on skills in Spark, PySpark, Hadoop, Kafka, Python, SQL, AWS Lambda, and broader big data engineering (see the sketch below).
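Below is a minimal, hypothetical PySpark sketch of the kind of batch job this stack implies; the bucket paths and column names are placeholders and are not part of the listing.

```python
# Minimal PySpark sketch of a batch aggregation job; paths and column
# names are placeholders, not taken from the listing.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-claims-rollup").getOrCreate()

# Read raw events from a data lake location (illustrative path).
events = spark.read.parquet("s3a://example-bucket/raw/claims/")

# Aggregate claim amounts per provider per day.
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("provider_id", "event_date")
    .agg(F.sum("claim_amount").alias("total_claim_amount"),
         F.count("*").alias("claim_count"))
)

# Write the rollup back out, partitioned by date.
(daily.write.mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://example-bucket/curated/daily_claims/"))

spark.stop()
```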
About Greenway Health

Founded: 2012
Stage: Profitable
About
Our EHR software and services connect providers to the right information and insights so they can make patient-driven care a reality.
Connect with the team
Gayathri Rajan
Padmashree S

Similar jobs

Archwell
Agency job
via AVI Consulting LLP by Sravanthi Puppala
Mysore
2 - 8 yrs
₹1L - ₹15L / yr
Snowflake
Python
SQL
Amazon Web Services (AWS)
Windows Azure
+6 more

Title: Data Engineer – Snowflake

 

Location: Mysore (Hybrid model)

Experience: 2-8 yrs

Type: Full Time

Walk-in date: 25th Jan 2023 @Mysore 

 

Job Role: We are looking for an experienced Snowflake developer to join our team as a Data Engineer. You will work as part of a team to help design and develop data-driven solutions that deliver insights to the business. The ideal candidate is a data pipeline builder and data wrangler who enjoys building data-driven systems from the ground up. You will be responsible for building and optimizing our data pipelines, as well as building automated processes for production jobs. You will support our software developers, database architects, data analysts, and data scientists on data initiatives.

 

Key Roles & Responsibilities:

  • Use advanced Snowflake, Python, and SQL to extract data from source systems for ingestion into a data pipeline (see the sketch after this list).
  • Design, develop and deploy scalable and efficient data pipelines.
  • Analyze and assemble large, complex datasets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements. For example: automating manual processes, optimizing data delivery, re-designing data platform infrastructure for greater scalability.
  • Build required infrastructure for optimal extraction, loading, and transformation (ELT) of data from various data sources using AWS and Snowflake leveraging Python or SQL technologies.
  • Monitor cloud-based systems and components for availability, performance, reliability, security and efficiency
  • Create and configure appropriate cloud resources to meet the needs of the end users.
  • As needed, document topology, processes, and solution architecture.
  • Share your passion for staying on top of tech trends, experimenting with and learning new technologies
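
A minimal sketch of the first bullet (extracting data from Snowflake with Python and SQL for pipeline ingestion), assuming the snowflake-connector-python package; the account, credentials, and table names are placeholders.

```python
# Minimal sketch: extract data from Snowflake with Python + SQL.
# Assumes snowflake-connector-python; account, credentials, and table
# names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.us-east-1",   # placeholder account identifier
    user="PIPELINE_USER",
    password="********",
    warehouse="ETL_WH",
    database="RAW_DB",
    schema="SOURCE",
)
try:
    cur = conn.cursor()
    # Bind variables keep the extract query parameterised.
    cur.execute(
        "SELECT order_id, customer_id, amount, updated_at "
        "FROM orders WHERE updated_at >= %s",
        ("2023-01-01",),
    )
    rows = cur.fetchall()  # hand these rows to the next pipeline stage
    print(f"extracted {len(rows)} rows")
finally:
    conn.close()
```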

 

Qualifications & Experience Requirements:

  • Bachelor's degree in computer science, computer engineering, or a related field.
  • 2-8 years of experience working with Snowflake.
  • 2+ years of experience with AWS services.
  • Ability to write stored procedures and functions in Snowflake.
  • At least 2 years of experience as a Snowflake developer.
  • Strong SQL knowledge.
  • Data ingestion into Snowflake using Snowflake procedures.
  • ETL experience is a must (any tool).
  • Awareness of Snowflake architecture.
  • Experience working on migration projects.
  • Data warehouse (DW) concepts (optional).
  • Experience with cloud data storage and compute components, including Lambda functions, EC2 instances, and containers.
  • Experience with data pipeline and workflow management tools such as Airflow.
  • Experience cleaning, testing, and evaluating data quality from a wide variety of ingestible data sources.
  • Experience working with Linux and UNIX environments.
  • Experience profiling data, with and without data definition documentation.
  • Familiarity with Git.
  • Familiarity with issue-tracking systems such as JIRA or Trello.
  • Experience working in an agile environment.

Desired Skills:

  • Experience with Snowflake; must be willing to obtain Snowflake certification within the first 3 months of employment.
  • Experience with a stream-processing system such as Snowpipe.
  • Working knowledge of AWS or Azure
  • Experience in migrating from on-prem to cloud systems
A deep-tech firm
Agency job
via wrackle by Naveen Taalanki
Bengaluru (Bangalore)
3 - 10 yrs
₹5L - ₹20L / yr
Data Science
Python
Natural Language Processing (NLP)
Deep Learning
TensorFlow
+2 more
Your responsibilities:
  • Build, improve and extend NLP capabilities
  • Research and evaluate different approaches to NLP problems
  • Write code that is well designed and produces deliverable results
  • Write code that scales and can be deployed to production
You must have:
  • Fundamentals of statistical methods are a must
  • Experience with named entity recognition, POS tagging, lemmatization, vector representations of textual data, and neural networks such as RNNs and LSTMs (see the sketch after this list)
  • A solid foundation in Python, data structures, algorithms, and general software development skills.
  • Ability to apply machine learning to problems that deal with language
  • Engineering ability to build robustly scalable pipelines
  • Ability to work in a multi-disciplinary team with a strong product focus
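
A minimal sketch of the NER/POS/lemmatization bullet above, assuming spaCy and its small English model (en_core_web_sm) are installed; the example sentence is illustrative.

```python
# Minimal NLP sketch: POS tagging, lemmatization, and named entity
# recognition. Assumes spaCy with the en_core_web_sm model installed.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple is opening a research lab in Bengaluru next year.")

# POS tag and lemma for each token.
for token in doc:
    print(token.text, token.pos_, token.lemma_)

# Named entities recognised in the text.
for ent in doc.ents:
    print(ent.text, ent.label_)
```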
AI-powered cloud-based SaaS solution
Bengaluru (Bangalore)
2 - 10 yrs
₹15L - ₹50L / yr
Data engineering
Big Data
Data Engineer
Big Data Engineer
Hibernate (Java)
+18 more
Responsibilities

● Able to contribute to gathering functional requirements, developing technical specifications, and project & test planning
● Demonstrating technical expertise and solving challenging programming and design problems
● Roughly 80% hands-on coding
● Generate technical documentation and PowerPoint presentations to communicate architectural and design options, and educate development teams and business users
● Resolve defects/bugs during QA testing, pre-production, production, and post-release patches
● Work cross-functionally with various Bidgely teams, including product management, QA/QE, various product lines, and/or business units to drive forward results

Requirements
● BS/MS in computer science or equivalent work experience
● 2-4 years' experience designing and developing applications in Data Engineering
● Hands-on experience with big data ecosystems: Hadoop, HDFS, MapReduce, YARN, AWS Cloud, EMR, S3, Spark, Cassandra, Kafka, ZooKeeper (see the sketch after this list)
● Expertise in any of the following object-oriented languages: Java/J2EE, Scala, Python
● Strong leadership experience: leading meetings, presenting if required
● Excellent communication skills: demonstrated ability to explain complex technical issues to both technical and non-technical audiences
● Expertise in the software design/architecture process
● Expertise with unit testing & Test-Driven Development (TDD)
● Experience on cloud platforms, preferably AWS
● Good understanding of and ability to develop software, prototypes, or proofs of concept (POCs) for various data engineering requirements
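
As a small illustration of the Spark/Kafka portion of the stack referenced above, here is a hedged Structured Streaming sketch in Python; the broker address, topic, and message schema are placeholders, and the spark-sql-kafka connector package is assumed to be on the Spark classpath.

```python
# Minimal Spark Structured Streaming sketch reading JSON from Kafka.
# Broker, topic, and schema are placeholders; requires the
# spark-sql-kafka-0-10 connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("meter-events-stream").getOrCreate()

schema = StructType([
    StructField("meter_id", StringType()),
    StructField("kwh", DoubleType()),
    StructField("event_ts", StringType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "meter-events")
       .load())

# Parse the Kafka value bytes as JSON into typed columns.
events = (raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
             .select("e.*"))

# Console sink for the sketch; a real job would write to S3, Cassandra, etc.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```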
Bengaluru (Bangalore)
6 - 15 yrs
₹40L - ₹90L / yr
Data Science
Deep Learning
Data Scientist
Machine Learning (ML)
Artificial Neural Network (ANN)
+9 more

Responsibilities

  • Build out and manage a young data science vertical within the organization

  • Provide technical leadership in the areas of machine learning, analytics, and data sciences

  • Work with the team to create a roadmap that addresses the company's requirements by solving data-mining, analytics, and ML problems: identifying business problems that could be solved using data science and scoping them out end to end.

  • Solve business problems by applying advanced Machine Learning algorithms and complex statistical models on large volumes of data.

  • Develop heuristics, algorithms, and models to deanonymize entities on public blockchains

  • Data Mining - Extend the organization’s proprietary dataset by introducing new data collection methods and by identifying new data sources.

  • Keep track of the latest trends in cryptocurrency usage on open-web and dark-web and develop counter-measures to defeat concealment techniques used by criminal actors.

  • Develop in-house algorithms to generate risk scores for blockchain transactions.

  • Work with data engineers to implement the results of your work.

  • Assemble large, complex data sets that meet functional / non-functional business requirements.

  • Build, scale and deploy holistic data science products after successful prototyping.

  • Clearly articulate and present recommendations to business partners, and influence future plans based on insights.

 

Preferred Experience

 

  • 8+ years of relevant experience as a Data Scientist or Analyst. A few years of work experience solving NLP problems or other ML problems is a plus

  • Must have previously managed a team of at least 5 data scientists or analysts, or demonstrate prior experience scaling a data science function from the ground up

  • Good understanding of Python, Bash scripting, and basic cloud platform skills (on GCP or AWS)

  • Excellent communication skills and analytical skills

What you’ll get

  • Work closely with the Founders in helping grow the organization to the next level alongside some of the best and brightest talents around you

  • An excellent culture, we encourage collaboration, growth, and learning amongst the team

  • Competitive salary and equity

  • An autonomous and flexible role where you will be trusted with key tasks.

  • An opportunity to have a real impact and be part of a company with purpose.

Bengaluru (Bangalore), Gurugram
1 - 7 yrs
₹4L - ₹10L / yr
Python
R Programming
SAS
Surveying
Data Analytics
+2 more

Desired Skills & Mindset:


We are looking for candidates who have demonstrated both a strong business sense and deep understanding of the quantitative foundations of modelling.


• Excellent analytical and problem-solving skills, including the ability to disaggregate issues, identify root causes and recommend solutions

• Experience with statistical programming software such as SPSS, and comfort working with large data sets.

• R, Python, SAS & SQL are preferred but not mandatory

• Excellent time management skills

• Good written and verbal communication skills; understanding of both written and spoken English

• Strong interpersonal skills

• Ability to act autonomously, bringing structure and organization to work

• Creative and action-oriented mindset

• Ability to interact in a fluid, demanding and unstructured environment where priorities evolve constantly, and methodologies are regularly challenged

• Ability to work under pressure and deliver on tight deadlines


Qualifications and Experience:


• Graduate degree in Statistics/Economics/Econometrics/Computer Science/Engineering/Mathematics/MBA (with a strong quantitative background) or equivalent

• Strong track record of work experience in the field of business intelligence, market research, and/or advanced analytics

• Knowledge of data collection methods (focus groups, surveys, etc.)

• Knowledge of statistical packages (SPSS, SAS, R, Python, or similar), databases, and MS Office (Excel, PowerPoint, Word)

• Strong analytical and critical thinking skills

• Industry experience in Consumer Experience/Healthcare is a plus

Hyderabad
4 - 7 yrs
₹12L - ₹28L / yr
Python
Spark
Big Data
Hadoop
Apache Hive
Must have:

  • At least 4 to 7 years of relevant experience as a Big Data Engineer
  • Hands-on experience in Scala or Python
  • Hands-on experience with major components of the Hadoop ecosystem such as HDFS, MapReduce, Hive, and Impala
  • Strong programming experience building applications/platforms using Scala or Python
  • Experience implementing Spark RDD transformations and actions to implement business analysis (see the sketch after this list)
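
A minimal sketch of Spark RDD transformations and actions in PySpark, using illustrative data rather than anything from the posting.

```python
# Minimal sketch of RDD transformations (lazy) and actions (eager) in PySpark.
# The (store, amount) pairs are illustrative data only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rdd-demo").getOrCreate()
sc = spark.sparkContext

orders = sc.parallelize([
    ("store_1", 120.0), ("store_2", 80.0), ("store_1", 45.5), ("store_3", 210.0),
])

# Transformations: filter small orders, then sum revenue per store.
revenue = (orders
           .filter(lambda kv: kv[1] >= 50.0)
           .reduceByKey(lambda a, b: a + b))

# Actions: trigger execution and return results to the driver.
print(revenue.collect())
print(orders.count())

spark.stop()
```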


We specialize in productizing solutions built on new technology.
Our vision is to build engineers with entrepreneurial and leadership mindsets who can create highly impactful products and solutions using technology to deliver immense value to our clients.
We strive to bring innovation and passion to everything we do, whether it is services, products, or solutions.
AxionConnect Infosolutions Pvt Ltd
Posted by Shweta Sharma
Pune, Bengaluru (Bangalore), Hyderabad, Nagpur, Chennai
5.5 - 7 yrs
₹20L - ₹25L / yr
Django
Flask
Snowflake
Snowflake schema
SQL
+4 more

Job Location: Hyderabad / Bangalore / Chennai / Pune / Nagpur

Notice period: Immediate - 15 days

 

Python Developer with Snowflake

 

Job Description :


  1. 5.5+ years of strong Python development experience with Snowflake.
  2. Strong hands-on experience with SQL and the ability to write complex queries.
  3. Strong understanding of how to connect to Snowflake using Python; should be able to handle any type of file.
  4. Development of data analysis and data processing engines using Python.
  5. Good experience in data transformation using Python.
  6. Experience in Snowflake data loads using Python (see the sketch after this list).
  7. Experience creating user-defined functions in Snowflake.
  8. SnowSQL implementation.
  9. Knowledge of query performance tuning is an added advantage.
  10. Good understanding of data warehouse (DWH) concepts.
  11. Interpret/analyze business requirements & functional specifications.
  12. Good to have: dbt, Fivetran, and AWS knowledge.
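
A minimal sketch of item 6 (loading data into Snowflake from Python), assuming a recent snowflake-connector-python with the pandas extras; the connection details and target table name are placeholders.

```python
# Minimal sketch of a Snowflake data load from Python via write_pandas.
# Assumes snowflake-connector-python[pandas]; connection parameters and
# the target table name are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.DataFrame({
    "ORDER_ID": [1, 2, 3],
    "AMOUNT": [120.0, 80.0, 45.5],
})

conn = snowflake.connector.connect(
    account="xy12345.us-east-1",   # placeholder
    user="LOADER",
    password="********",
    warehouse="LOAD_WH",
    database="STAGE_DB",
    schema="PUBLIC",
)
try:
    # write_pandas stages the DataFrame and COPYs it into the target table.
    success, n_chunks, n_rows, _ = write_pandas(
        conn, df, table_name="ORDERS", auto_create_table=True
    )
    print(f"loaded {n_rows} rows (success={success})")
finally:
    conn.close()
```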
Technology service company
Remote only
5 - 10 yrs
₹10L - ₹20L / yr
Relational Database (RDBMS)
NoSQL Databases
NoSQL
Performance tuning
SQL
+10 more

Preferred Education & Experience:

  • Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field, or equivalent practical experience. At least 3 years of relevant experience may be considered in lieu of the above if from a different stream of education.

  • Well-versed in and 5+ years of hands-on demonstrable experience with:
    ▪ Data Analysis & Data Modeling
    ▪ Database Design & Implementation
    ▪ Database Performance Tuning & Optimization
    ▪ PL/pgSQL & SQL

  • 5+ years of hands-on development experience in Relational Database (PostgreSQL/SQL Server/Oracle).

  • 5+ years of hands-on development experience in SQL and PL/pgSQL, including stored procedures, functions, triggers, and views (see the sketch after this list).

  • Hands-on experience with demonstrable working experience in Database Design Principles, SQL Query Optimization Techniques, Index Management, Integrity Checks, Statistics, and Isolation levels

  • Hands-on experience with demonstrable working experience in Database Read & Write Performance Tuning & Optimization.

  • Knowledge of and experience working with Domain-Driven Design (DDD) concepts, Object-Oriented Programming (OOPS) concepts, cloud architecture concepts, and NoSQL database concepts are an added value

  • Knowledge and working experience in Oil & Gas, Financial, & Automotive Domains is a plus

  • Hands-on development experience in one or more NoSQL data stores such as Cassandra, HBase, MongoDB, DynamoDB, Elastic Search, Neo4J, etc. a plus.
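
A minimal sketch of the SQL/PL/pgSQL item above, driven from Python with psycopg2 so it stays in the same language as the other sketches; the connection string, table, and function names are hypothetical.

```python
# Minimal sketch: create and call a PL/pgSQL function from Python.
# Assumes psycopg2; connection string, table, and function names are
# hypothetical.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS orders (
    order_id   serial PRIMARY KEY,
    amount     numeric NOT NULL,
    created_at timestamptz DEFAULT now()
);

-- Simple PL/pgSQL function: total order amount since a given timestamp.
CREATE OR REPLACE FUNCTION total_since(p_from timestamptz)
RETURNS numeric
LANGUAGE plpgsql AS $$
DECLARE
    v_total numeric;
BEGIN
    SELECT COALESCE(SUM(amount), 0) INTO v_total
    FROM orders
    WHERE created_at >= p_from;
    RETURN v_total;
END;
$$;
"""

conn = psycopg2.connect("dbname=appdb user=app password=*** host=localhost")
try:
    with conn, conn.cursor() as cur:   # 'with conn' commits on success
        cur.execute(DDL)
        cur.execute("SELECT total_since(%s)", ("2023-01-01",))
        print("total:", cur.fetchone()[0])
finally:
    conn.close()
```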

A Fintech startup in Dubai
Agency job
via Jobbie by Sourav Nandi
Remote, Dubai, Bengaluru (Bangalore), Mumbai
2 - 18 yrs
₹14L - ₹38L / yr
skill iconData Science
skill iconPython
skill iconR Programming
RESPONSIBILITIES AND QUALIFICATIONS

The mission of the Marcus Surveillance Analytics team is to deliver a platform which detects security incidents that have a tangible business impact and an actionable response. You will work alongside industry-leading technologists who have recently joined Goldman from across consumer security, technology, fintech, finance, and quant firms. The role has a broad scope and will involve interacting with senior leaders of Goldman and the Consumer business on a regular basis. The position is hands-on and requires a driven, "take ownership" oriented individual who is intently focused on execution. You will work directly with developers, business leaders, vendors, and partners in order to deliver security assets to the consumer business.

Responsibilities:
  • Develop a team, vision, and platform which identifies and prioritizes actionable security & fraud risks that have tangible business impact across Goldman's consumer and commercial banking businesses.
  • Develop response and recovery technology and programs to ensure resilience from fraud and abuse events.
  • Manage, develop, and operationalize analytics which discover security & fraud events and identify risks for all of Goldman's consumer businesses.
  • Partner with fraud/abuse operations and leadership to ensure consumer fraud rates are within industry norms, and own outcomes related to fraud improvements.

Skills and experience we are looking for:
  • BA/BS degree in Computer Science, Cybersecurity, or other relevant Computer/Data/Engineering degrees
  • 2+ years of experience as a security professional or data analyst/scientist/engineer
  • Python, PySpark, R, Bash, SQL, Splunk (search, ES, UBA)
  • Experience with cloud infrastructure/big data tool sets
  • Visualization tools such as Tableau or D3
  • Research and development to create innovative predictive detections for security and fraud
  • Build a 24/7 real-time monitoring system with a long-term vision for scaling to new lines of consumer businesses
  • Strong focus on customer experience and product usability
  • Ability to work closely with the business, fraud, and security incident response teams on creating actionable detections
OpexAI
Posted by Jasmine Shaik
Hyderabad
0 - 1 yrs
₹1L - ₹1L / yr
Data Science
R Programming
Python
TensorFlow
Freshers with skills in big data, data science, or computer vision.