Database Architect

at Grand Hyper

Posted by Rahul Malani
Bengaluru (Bangalore)
5 - 10 yrs
₹10L - ₹20L / yr
Full time
Skills
Data Warehouse (DWH)
Apache Hive
ETL
DWH Cloud
Hadoop
Spark
MongoDB
PostgreSQL
The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization to drive high-visibility, cross-division outcomes. Expected deliverables include the development of Big Data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's Data Lake.

Key Responsibilities:
- Create a GRAND Data Lake and Warehouse that pools the data from the different regions and stores of GRAND in the GCC
- Ensure source data quality measurement, enrichment and reporting of data quality
- Manage all ETL and data model update routines
- Integrate new data sources into the DWH
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure

Skills Needed:
- Very strong in SQL; demonstrated experience with RDBMS and NoSQL databases (e.g., Postgres, MongoDB); Unix shell scripting preferred
- Experience with Unix and comfortable working in the shell (bash or Korn shell preferred)
- Good understanding of data warehousing concepts and big data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce
- Align with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop, and to expand existing environments
- Work with data delivery teams to set up new Hadoop users; this includes setting up Linux users and setting up and testing HDFS, Hive, Pig and MapReduce access for the new users
- Cluster maintenance, as well as creation and removal of nodes, using tools like Ganglia, Nagios and Cloudera Manager Enterprise
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screen Hadoop cluster job performance and do capacity planning
- Monitor Hadoop cluster connectivity and security
- File system management and monitoring
- HDFS support and maintenance
- Collaborate with application teams to install operating system and Hadoop updates, patches and version upgrades when required
- Define, develop, document and maintain Hive-based ETL mappings and scripts
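The data-quality measurement duty above can be illustrated with a minimal null-rate check. This is only a sketch of the idea, not GRAND's actual pipeline; the table and column names are hypothetical, and SQLite stands in for whatever RDBMS the warehouse uses:

```python
import sqlite3

# Hypothetical staging table standing in for one regional store feed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_staging (store_id TEXT, sku TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales_staging VALUES (?, ?, ?)",
    [("BLR-01", "A100", 250.0), ("BLR-01", None, 99.5), ("DXB-02", "B200", None)],
)

def null_rates(conn, table, columns):
    """Return the fraction of NULLs per column -- a basic source-data-quality metric."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return {
        col: conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL"
        ).fetchone()[0] / total
        for col in columns
    }

rates = null_rates(conn, "sales_staging", ["store_id", "sku", "amount"])
print(rates)
```

A real pipeline would report these rates per source feed and alert when a column's null rate crosses a threshold.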

About Grand Hyper

The group currently operates Grand Shopping Malls, Grand Hypermarkets and Grand Xpress in the Middle East and India.
Founded
2017
Type
Products & Services
Size
20-100 employees
Stage
Raised funding

Similar jobs

Java/Scala Lead

at HyrHub

Founded 2015  •  Services  •  employees  •  Bootstrapped
Java
Scala
Spark
Bengaluru (Bangalore)
5 - 11 yrs
₹15L - ₹25L / yr

Job Requirements :

- Define, implement and validate solution frameworks and architecture patterns for data modeling, data integration, processing, reporting, analytics and visualization using leading cloud, big data, open-source and other enterprise technologies.

- Develop scalable data and analytics solutions leveraging standard platforms, frameworks, patterns and full stack development skills.

- Analyze, characterize and understand data sources, participate in design discussions and provide guidance related to database technology best practices.

- Write tested, robust code that can be quickly moved into production

Responsibilities :

- Experience with distributed data processing and management systems.

- Experience with cloud technologies including Spark SQL, Java/ Scala, HDFS, AWS EC2, AWS S3, etc.

- Familiarity with leveraging and modifying open source libraries to build custom frameworks.

Primary Technical Skills :
- Spark SQL, Java/ Scala, Sbt/ Maven/ Gradle, HDFS, Hive, AWS(EC2, S3, SQS, EMR, Glue Scripts, Lambda, Step Functions), IntelliJ IDE, JIRA, Git, Bitbucket/GitLab, Linux, Oozie.

Job posted by
Shwetha Naik

Tableau Lead / Sr Developer

at A fast-growing SaaS commerce company (permanent WFH & office)

Agency job
via Jobdost
SQL
Relational Database (RDBMS)
Amazon Redshift
PostgreSQL
Data Analytics
Data Warehouse (DWH)
Tableau
Scripting language
Bengaluru (Bangalore)
5 - 8 yrs
₹20L - ₹25L / yr

What is the role?

You will be responsible for building and maintaining highly scalable data infrastructure for our cloud-hosted SaaS product. You will work closely with the Product Managers and the technical team to define and implement data pipelines for customer-facing and internal reports.

Key Responsibilities

  • Understand the business processes and requirements thoroughly and convert them into reports.
  • Advise the users of the reports on the right approach.
  • Develop, maintain, and manage advanced reporting, analytics, dashboards and other BI solutions.

What are we looking for?

An enthusiastic individual with the following skills. Please do not hesitate to apply if you do not match all of them. We are open to promising candidates who are passionate about their work and are team players.

  • Education: BE/MCA or equivalent
  • Good experience working on the performance side of reports.
  • Expert-level knowledge of querying in any RDBMS, preferably Redshift or Postgres
  • Expert-level knowledge of data warehousing concepts
  • Advanced scripting to create calculated fields, sets, parameters, etc.
  • Degree in mathematics, computer science, information systems, or a related field.
  • 5-7 years of experience with Tableau and data warehouses.

Whom will you work with?

You will work with a top-notch tech team, working closely with the CTO and product team.  

What can you look for?

A wholesome opportunity in a fast-paced environment that will let you juggle between concepts while maintaining the quality of content, interact and share your ideas, and learn a great deal at work. Work with a team of highly talented young professionals and enjoy the benefits.

We are

A fast-growing SaaS commerce company based in Bangalore with offices in Delhi, Mumbai, SF, Dubai, Singapore and Dublin. We have three products in our portfolio: Plum, Empuls and Compass. We work with over 1,000 global clients. We help our clients in engaging and motivating their employees, sales teams, channel partners or consumers for better business results.

 

Job posted by
Mamatha A

Data Engineer

at Top startup of India - News App

Agency job
via Jobdost
Linux/Unix
Python
Hadoop
Apache Spark
MongoDB
Data flow
BigQuery
NOSQL Databases
Google Cloud Platform (GCP)
Noida
2 - 5 yrs
₹20L - ₹35L / yr
Responsibilities
● Create and maintain optimal data pipeline architecture.
● Assemble large, complex data sets that meet functional / non-functional
business requirements.
● Building and optimizing ‘big data’ data pipelines, architectures and data sets.
● Maintain, organize & automate data processes for various use cases.
● Identifying trends, doing follow-up analysis, preparing visualizations.
● Creating daily, weekly and monthly reports of product KPIs.
● Create informative, actionable and repeatable reporting that highlights
relevant business trends and opportunities for improvement.

Required Skills And Experience:
● 2-5 years of work experience in data analytics, including analyzing large data sets.
● BTech in Mathematics/Computer Science
● Strong analytical, quantitative and data interpretation skills.
● Hands-on experience with Python, Apache Spark, Hadoop, NoSQL databases (MongoDB preferred) and Linux is a must.
● Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
● Experience with Google Cloud Data Analytics Products such as BigQuery, Dataflow, Dataproc etc. (or similar cloud-based platforms).
● Experience working within a Linux computing environment, and use of
command-line tools including knowledge of shell/Python scripting for
automating common tasks.
● Previous experience working at startups and/or in fast-paced environments.
● Previous experience as a data engineer or in a similar role.
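The reporting duties above (daily/weekly/monthly product KPIs, scripted automation) could be sketched as a small aggregation script. This is an illustrative sketch only; the event names and counts are hypothetical stand-ins for the app's real analytics data:

```python
from collections import defaultdict
from datetime import date

# Hypothetical event-log rows: (day, event, count).
events = [
    (date(2022, 3, 1), "article_read", 120),
    (date(2022, 3, 1), "share", 30),
    (date(2022, 3, 2), "article_read", 150),
]

def daily_kpis(rows):
    """Aggregate event counts per day -- the kind of report this role would automate."""
    out = defaultdict(lambda: defaultdict(int))
    for day, event, count in rows:
        out[day][event] += count
    return {d: dict(v) for d, v in out.items()}

report = daily_kpis(events)
print(report)
```

In practice the rows would come from BigQuery or Spark rather than a literal list, and the script would run on a daily schedule.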
Job posted by
Sathish Kumar

Production Support Engineer

at A Global Leader in technology Transformation

Agency job
via Jobdost
Java
SQL
RESTful APIs
Webservices
Web applications
PostgreSQL
.NET
Chennai
5 - 8 yrs
₹10L - ₹12L / yr

Mandatory

·  Candidates must have substantial applicable IT experience, with extensive development and programming experience in Java, .NET, VB or other web application development.

·  Strong experience with databases (PostgreSQL, SQL Server) and web services (RESTful or similar).

Job posted by
Sathish Kumar

Senior/Lead Software Engineer - Infrastructure, Cloud

at 6sense

Founded 2013  •  Product  •  1000-5000 employees  •  Raised funding
Java
API
Python
Kubernetes
Docker
HLD
Ansible
Chef
Puppet
Amazon Web Services (AWS)
Hadoop
Apache Hive
Remote only
4 - 10 yrs
Best in industry

The Company:

It’s no surprise that 6sense is named a top workplace year after year — we have industry-leading technology developed and taken to market by a world-class team. 6sense is Top Rated on Glassdoor with a 4.9/5 and our CEO Jason Zintak was recognized as the #1 CEO in the small & medium business category by Glassdoor’s 2021 Top CEO Employees Choice Awards.

In 2021, the company was recognized for having the Best Company for Diversity, Best Company for Women, Best CEO, Best Company Culture, Best Company Perks & Benefits and Happiest Employees from the employee feedback platform Comparably. In addition, 6sense has also won several accolades that demonstrate its reputation as an employer of choice including the Glassdoor Best Place to Work (2022), TrustRadius Tech Cares (2021) and Inc. Best Workplaces (2022, 2021, 2020, 2019).

6sense reinvents the way organizations create, manage, and convert pipeline to revenue. The 6sense Revenue AI captures anonymous buying signals, predicts the right accounts to target at the ideal time, and recommends the channels and messages to boost revenue performance. Removing guesswork, friction and wasted sales effort, 6sense empowers sales, marketing, and customer success teams to significantly improve pipeline quality, accelerate sales velocity, increase conversion rates, and grow revenue predictably.

Senior Software Engineer - Infrastructure, Cloud

Responsibilities:

Develop and deploy services to improve the availability, ease of use/management, and visibility of 6sense systems

Building and scaling out our services and infrastructure

Learning and adopting technologies that may aid in solving our challenges

Own our critical underlying systems like AWS, Kubernetes, Mesos, infrastructure deployment, and compute cluster architecture (which services frameworks and engines like Hadoop/Hive/Presto)

Write/review/debug production code, develop documentation and capacity plans, and debug live production problems

Contribute back to open-source projects if we need to add or patch functionality
Support the overall Software Engineering team to resolve any issues they encounter

Minimum Qualifications:

5+ years of experience with Linux/Unix system administration and networking fundamentals
3+ years in a Software Engineering role or equivalent experience
4+ years of working with AWS
4+ years of experience working with Kubernetes and Docker

Strong skills in reading code as well as writing clean, maintainable, and scalable code
Good knowledge of Python
Experience designing, building, and maintaining scalable services and/or service-oriented architecture
Experience with high-availability systems
Experience with modern configuration management tools (e.g. Ansible/AWX, Chef, Puppet, Pulumi) and idempotency
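The idempotency the last point asks for means an operation can be re-run safely and converges to the same state. A minimal sketch, assuming nothing beyond the standard library (the file path and config line are hypothetical; real tooling like Ansible or Chef does this declaratively):

```python
import os
import tempfile

def ensure_line(path, line):
    """Append `line` to the file only if it is absent; return True if a change was made.
    Idempotent in the configuration-management sense: re-running is a no-op."""
    try:
        with open(path) as f:
            if line in (row.rstrip("\n") for row in f):
                return False  # desired state already holds -> change nothing
    except FileNotFoundError:
        pass
    with open(path, "a") as f:
        f.write(line + "\n")
    return True

path = os.path.join(tempfile.mkdtemp(), "sshd_config")  # hypothetical target file
first = ensure_line(path, "PermitRootLogin no")   # creates the file, makes the change
second = ensure_line(path, "PermitRootLogin no")  # no-op on the second run
print(first, second)
```

Reporting whether a change was made (rather than just applying it) is what lets tools like Ansible show "changed" vs "ok" per task.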

Bonus Requirements:

Knowledge of standard security practices
Knowledge of the Hadoop ecosystem (e.g. Hadoop, Hive, Presto), including deployment, scaling, and maintenance
Experience with operating and maintaining VPN/SSH/ZeroTrust access infrastructure
Experience with CDNs such as CloudFront and Akamai
Good knowledge of Javascript, Java, Golang
Exposure to modern build systems such as Bazel, Buck, or Pants

Every person in every role at 6sense owns a part of defining the future of our industry-leading technology. You’ll join a team where curiosity is prized, no one’s satisfied with the status quo, and everyone’s all-in on the collective good. 6sense is a place where difference-makers roll up their sleeves, take risks, act with integrity, and measure success by the value we create for our customers.

We want 6sense to be the best chapter of your career.

Feel part of something

You’ll be part of building tomorrow’s tech, revolutionizing how marketing and sales teams create, manage, and convert pipeline to revenue. And you’ll be seen and appreciated by co-workers who challenge you, cheer you on, and always have your back.

At 6sense, you’ll experience the passion from customers and colleagues alike for our market-leading vision, and you're entrusted with applying your unique talents to help bring that vision to life.

Build a career

As part of a company on a rocketship trajectory, there’s no way around it: You’re going to experience unparalleled career growth. With colleagues as humble and hungry as you are, and a leadership philosophy grounded in trust, transparency, and empowerment, every day is a chance to improve on the one before.

Enjoy access to our Udemy Training Library with 5,000+ courses, give and get recognition from your coworkers, and spend time with our executive team every two weeks in our All Hands gathering to connect, learn and ask leaders about whatever is on your mind.

Enjoy work, and your life

This is a place where you’ll do your best work and inspire others to do theirs — where you’re guaranteed to make real connections, for life, along the way.

We want to help you prioritize health and wellness, today and tomorrow. Take advantage of family medical coverage; a monthly stipend to support your physical, mental, and financial wellness; and generous paid parental leave benefits. Plus, we have an open time-off policy, so you can take the time you need.

Set for success 

A vision as big as ours only comes to life when we’re all winning together.

We’ll make sure you have the equipment you need to work at home or in one of our offices, and the right snacks, pens or lighting with our work-from-home expense reimbursement allowance. We also partner with WeWork to make sure that if your choice is a hybrid of home and office, we have you covered in the locations they’re offered.

That’s the commitment we make to every one of our employees. If this sounds like a place where you'll thrive as you take your success to the next level, let’s chat!

Job posted by
Kunjan Bhagat
Team Management
Java
Hadoop
Microservices
People Management
Amazon Web Services (AWS)
Bengaluru (Bangalore)
7 - 13 yrs
₹10L - ₹12L / yr

Senior Team Lead, Software Engineering (96386)

 

Role: Senior Team Lead


Skills: must be an expert in these -

  1. Java
  2. Microservices
  3. Hadoop
  4. People Management Skills.

                   
Knowledge of the following will be a plus -

AWS

Location: Bangalore, India – North Gate.

 

Job posted by
Rijooshri Saikia

Product Support Engineer

at Fynd

Founded 2012  •  Product  •  100-500 employees  •  Raised funding
Product support
Shell Scripting
Technical support
Scripting language
Linux/Unix
MySQL
PostgreSQL
Production support
Mumbai, Noida, Bengaluru (Bangalore)
2 - 7 yrs
₹8L - ₹18L / yr
In this role you would be responsible for the day-to-day maintenance of engineering systems. You would also often act as the first line of support for internal applications, fixing bugs and developing and deploying small components of code.

The Production Support Engineer is a technical role that ensures the stability, integrity, and efficient operation of the platform/system and services.

High Impact production issues often require coordination between multiple Engineering, Infrastructure and Product groups, so you get to experience a breadth of impact with various groups.

Responsibilities

  • Logging and keeping records of various issues to help the team prioritize fixes and automations, along with measuring the product quality.
  • Documenting troubleshooting and problem resolution steps.
  • Monitor alerting channels, analyze problems, diagnose and do occasional code fixes with low to medium complexity.
  • Taking ownership of technical issues and working closely with developers to resolve more complicated problems.
  • Work closely with product and developers to enhance the quality of existing products.
  • Resolving escalated customer complaints without the need for team lead intervention.
  • Address urgent issues quickly, work within and measure against customer SLAs.
  • Write scripts (shell, python, ruby, php) and aggressively automate manual / repetitive tasks.
  • Automate scripts / tasks for reporting and maintenance, and build anomaly detectors and alerting wherever applicable.
  • Develop smaller complexity features/enhancements in existing products.
  • Perform in-depth research and identify sources of production issues surrounding the application.
  • Work closely with business in managing day to day issues, resolve user queries.
  • Perform daily health checks of the application, job schedules and infrastructure supporting the application.
  • Develop and facilitate monitoring systems to identify issues before they happen.
  • Identify, develop and design features to solve pattern of problems to stabilize production systems.
  • Create accurate DB queries that will identify affected data and rectify them.
  • Build a deep understanding of the domain.
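The "create accurate DB queries that will identify affected data and rectify them" responsibility above usually splits into two steps: first a read-only query to scope the damage, then a fix limited to exactly those rows, inside a transaction. A hedged sketch with hypothetical table and status names, using SQLite as a stand-in for MySQL/PostgreSQL:

```python
import sqlite3

# Hypothetical orders table with rows left in a stuck state by a bug.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "PAID"), (2, "STUCK_PAYMENT"), (3, "STUCK_PAYMENT")])

# Step 1: identify affected rows first, so the blast radius is known and reviewable.
affected = [row[0] for row in conn.execute(
    "SELECT id FROM orders WHERE status = 'STUCK_PAYMENT'")]
print("affected ids:", affected)

# Step 2: rectify inside a transaction, scoped to exactly those ids.
with conn:  # commits on success, rolls back on error
    conn.executemany("UPDATE orders SET status = 'PAID' WHERE id = ?",
                     [(i,) for i in affected])

remaining = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE status = 'STUCK_PAYMENT'").fetchone()[0]
print("remaining stuck:", remaining)
```

Logging the identified ids before the UPDATE also produces the record-keeping the responsibilities list asks for.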

Requirements

  • Knowledge of Unix / Linux based systems.
  • Experience working with MySQL and PostgreSQL and writing simple queries to get data for debugging issues.
  • Being able to creatively come up with solutions for various problems and implement them.
  • An in-depth understanding of the different products and the ability to navigate through the code to debug and make small fixes.
  • Hands on with any of the scripting languages like Bash, Python, PHP, Ruby.
  • Excellent analytical and logical thinking.
  • Quick troubleshooting and diagnosing skills.
  • Problem solving and debugging skills.
  • Ability to join the dots across multiple events occurring concurrently and spot patterns.

Good to have requirements

  • Prior production support experience.
  • Prior programming experience.
  • Familiarity with Grafana, Prometheus, Elasticsearch, Sentry
  • Experience in dealing with RESTful web services is a plus.
  • Worked with microservices architecture
Job posted by
Akshata Kadam
Data Warehouse (DWH)
ETL
ADF
Business Intelligence (BI)
Data architecture
SQL Azure
Azure Databricks
Gurgaon/Gurugram, Delhi, Noida, Ghaziabad, Faridabad
7 - 15 yrs
₹25L - ₹45L / yr
Responsibilities

* Formulates and recommends standards for achieving maximum performance and efficiency of the DW ecosystem.
* Participates in pre-sales activities for solutions to various customer problem statements/situations.
* Develops business cases and ROI for the customer/clients.
* Interviews stakeholders and develops a BI roadmap for success, given project prioritization.
* Evangelizes self-service BI and visual discovery while helping to automate any manual process at the client site.
* Works closely with the Engineering Manager to ensure prioritization of customer deliverables.
* Champions data quality, integrity, and reliability throughout the organization by designing and promoting best practices.
* Implementation (20%): helps DW/DE team members with issues needing technical expertise or complex systems and/or programming knowledge.
* Provides on-the-job training for new or less experienced team members.
* Develops a technical excellence team.

Requirements

- Experience designing business intelligence solutions
- Experience with the ETL process and data warehouse architecture
- Experience with Azure data services, i.e., ADF, ADLS Gen 2, Azure SQL DB, Synapse, Azure Databricks, and Power BI
- Good analytical and problem-solving skills
- Fluent in relational database concepts and flat file processing concepts
- Must be knowledgeable in software development lifecycles/methodologies
Job posted by
Vikas Shelke

Database Developer

at Digital, a leading provider of omnichannel payment. (PC1)

Agency job
via Multi Recruit
Oracle
PostgreSQL
SQL
PLSQL
Performance tuning
Plsql
Database Management
Pune
3 - 8 yrs
₹12L - ₹16L / yr
  • Minimum 3 years of working experience on database-related projects.
  • Working experience on any of the leading database management systems, viz., Oracle, SQL Server, etc. (experience with Postgres database systems will be an added advantage).
  • Adequate database development skills; should be proficient in SQL coding.
  • Working experience in database SQL optimization/performance tuning.
  • Working experience with any of the database programming languages, viz., PL/SQL, PL/pgSQL, T-SQL, etc.
  • Flexible to learn, adopt & scale to any data-centric technologies as per the evolving organizational product strategies.
  • Ability to troubleshoot database-related issues.

Good-to-have skill sets:

  • Working experience on data warehouse projects, or sound knowledge of the concepts.
  • Working experience on data analytics/reporting/ETL, or sound knowledge of the concepts.
  • A database-agnostic approach to data/database solutions.
  • Sound knowledge of big data/Hadoop, Kafka, MapReduce programming, NoSQL, etc.
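The SQL optimization/performance-tuning skill above boils down to reading query plans before and after adding an index. A small sketch using SQLite's EXPLAIN QUERY PLAN as a stand-in for the equivalent tooling in Oracle, SQL Server or Postgres (table and index names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (id INTEGER, account TEXT, amount REAL)")
conn.executemany("INSERT INTO txn VALUES (?, ?, ?)",
                 [(i, f"acct-{i % 100}", float(i)) for i in range(1000)])

def plan(conn, sql):
    # The 4th column of each EXPLAIN QUERY PLAN row is the human-readable detail:
    # it shows whether SQLite scans the whole table or searches an index.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM txn WHERE account = 'acct-7'"
before = plan(conn, query)  # full table scan: every row is examined
conn.execute("CREATE INDEX idx_txn_account ON txn(account)")
after = plan(conn, query)   # after tuning: index search on the filter column
print(before)
print(after)
```

Postgres's EXPLAIN ANALYZE and Oracle's execution plans apply the same idea with richer cost estimates.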
Job posted by
Sapna Deb

Data Engineer

at Big revolution in the e-gaming industry. (GK1)

Agency job
via Multi Recruit
Python
Scala
Hadoop
Spark
Data Engineer
Kafka
Luigi
Airflow
Nosql
Bengaluru (Bangalore)
2 - 3 yrs
₹15L - ₹20L / yr
  • We are looking for a Data Engineer to build the next-generation mobile applications for our world-class fintech product.
  • The candidate will be responsible for expanding and optimising our data and data pipeline architecture, as well as optimising data flow and collection for cross-functional teams.
  • The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimising data systems and building them from the ground up.
  • Looking for a person with a strong ability to analyse and provide valuable insights to the product and business team to solve daily business problems.
  • You should be able to work in a high-volume environment and have outstanding planning and organisational skills.

 

Qualifications for Data Engineer

 

  • Working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimising ‘big data’ data pipelines, architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Looking for a candidate with 2-3 years of experience in a Data Engineer role, who is a CS graduate or has an equivalent experience.

 

What we're looking for?

 

  • Experience with big data tools: Hadoop, Spark, Kafka and other alternate tools.
  • Experience with relational SQL and NoSQL databases, including MySQL/Postgres and MongoDB.
  • Experience with data pipeline and workflow management tools: Luigi, Airflow.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
  • Experience with stream-processing systems: Storm, Spark-Streaming.
  • Experience with object-oriented/object function scripting languages: Python, Java, Scala.
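The workflow-management tools listed above (Luigi, Airflow) schedule tasks in dependency order over a DAG. A minimal sketch of that core idea using only the standard library; the task names are illustrative, and real Airflow/Luigi DAGs declare dependencies through their own APIs rather than a plain dict:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Toy pipeline: each task maps to the set of tasks it depends on.
dag = {
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# A scheduler must run dependencies before dependents; topological sort gives
# one such valid execution order.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

A real scheduler adds retries, parallelism for independent tasks, and persistence of task state on top of this ordering.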
Job posted by
Ayub Pasha