
11+ SQR Jobs in India

Apply to 11+ SQR Jobs on CutShort.io. Find your next job, effortlessly. Browse SQR Jobs and apply today!

ITC Infotech
Agency job
via Kadbit solutions by Veeresh Kumar TJ
Bengaluru (Bangalore)
4 - 12 yrs
₹6L - ₹16L / yr
Peoplesoft
Enterprise Resource Planning (ERP)
PeopleCode
Application engine
Human Resource Management System (HRMS)
+3 more

PeopleSoft Technical - SSE

Sr. Software Engineer with 5 to 9 years of experience in PeopleSoft

 

Mandatory:

Good knowledge and a minimum of 5 years of hands-on technical experience with Application Designer, PeopleCode, Application Engine, Component Interface, Application Packages, and Integration Broker.

Good verbal and written communication skills

 

Preferred:

Sound techno-functional and business-process conceptual knowledge of PeopleSoft Financials procure-to-pay modules: AP, GL, PC, AM, PO, ePro, SCMT, Catalog Management, and eSettlements.

Experience with reporting tools such as BI Publisher, SQR, PS Query, and nVision, as well as PS Security, would be an added advantage.

Experience working on upgrade, support, and development projects in PeopleSoft Financials.

Experience across all phases of a project: design, development, testing, etc.

Knowledge of mainframes would be an added advantage

Top 3 Fintech Startup
Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
6 - 9 yrs
₹16L - ₹24L / yr
SQL
Amazon Web Services (AWS)
Spark
PySpark
Apache Hive

We are looking for an exceptionally talented Lead Data Engineer with experience implementing AWS services to build data pipelines, integrate APIs, and design data warehouses. A candidate with both hands-on and leadership capabilities will be ideal for this position.

 

Qualification: At least a bachelor’s degree in Science, Engineering, or Applied Mathematics; a master’s degree is preferred.

 

Job Responsibilities:

• Total 6+ years of experience as a Data Engineer and 2+ years of experience in managing a team

• A minimum of 3 years of AWS Cloud experience.

• Well versed in languages such as Python, PySpark, SQL, NodeJS, etc.

• Extensive experience in the real-time Spark ecosystem, having worked on both real-time and batch processing (a minimal PySpark sketch follows this list).

• Experience with AWS Glue, EMR, DMS, Lambda, S3, DynamoDB, Step Functions, Airflow, RDS, Aurora, etc.

• Experience with modern Database systems such as Redshift, Presto, Hive etc.

• Worked on building data lakes in the past on S3 or Apache Hudi

• Solid understanding of Data Warehousing Concepts

• Good to have experience on tools such as Kafka or Kinesis

• Good to have AWS Developer Associate or Solutions Architect Associate Certification

• Have experience in managing a team
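For illustration only: a minimal PySpark batch-aggregation sketch of the kind of pipeline work described above. It is not from the posting; the bucket paths, column names, and app name are hypothetical.

```python
# Minimal PySpark batch job sketch (illustrative; all names and paths are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders-batch").getOrCreate()

# Read raw JSON events previously landed in S3.
orders = spark.read.json("s3://example-raw-bucket/orders/2024-01-01/")

# Aggregate completed orders into per-customer daily revenue.
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy("customer_id")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Write the result back to S3 as date-partitioned Parquet for the warehouse layer.
(
    daily_revenue
    .withColumn("ds", F.lit("2024-01-01"))
    .write.mode("overwrite")
    .partitionBy("ds")
    .parquet("s3://example-curated-bucket/daily_revenue/")
)

spark.stop()
```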

Amagi Media Labs

Posted by Rajesh C
Chennai
5 - 7 yrs
₹20L - ₹30L / yr
Technical support
Tech Support
SQL
Informatica
PySpark
+1 more
Job Title: Support Engineer L3
Job Location: Chennai
We at Condé Nast are looking for a Support Engineer (Level 3) who will be responsible for monitoring and maintaining the production systems to ensure business continuity. Your responsibilities also include prompt communication to business and internal teams about process delays, stability issues, and resolutions.
Primary Responsibilities
● 5+ years of experience in production support
● The Support Data Engineer is responsible for monitoring the data pipelines that are in production.
● Level 3 support activities: analysing issues, debugging programs and jobs, and bug fixes
● The position will contribute to the monitoring, rerun or reschedule, and code fixes of pipelines for a variety of projects on a daily basis (see the Airflow rerun sketch after this list).
● Escalate failures to the Data Team/DevOps in case of infrastructure failures or when unable to revive the data pipelines.
● Ensure accurate alerts are raised in case of pipeline failures and that the corresponding stakeholders (Business/Data Teams) are notified within the agreed-upon SLAs.
● Prepare and present success/failure metrics by accurately logging the monitoring stats.
● Able to work in shifts to provide overlap with US business teams
● Other duties as requested or assigned.
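As a rough sketch of the rerun duty above, assuming the pipelines are orchestrated with Apache Airflow (named in the desired skills below); the DAG id and dates are hypothetical placeholders.

```python
# Hypothetical helper: clear only the failed Airflow task instances for one
# logical date so the scheduler re-runs them. DAG id and date are placeholders.
import subprocess
from datetime import date

def rerun_failed_tasks(dag_id: str, run_date: date) -> None:
    day = run_date.isoformat()
    subprocess.run(
        [
            "airflow", "tasks", "clear", dag_id,
            "--start-date", day,
            "--end-date", day,
            "--only-failed",
            "--yes",
        ],
        check=True,
    )

if __name__ == "__main__":
    # Example: revive failed tasks for a hypothetical ingestion DAG.
    rerun_failed_tasks("content_events_ingest", date(2024, 1, 1))
```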
Desired Skills & Qualification
● Strong working knowledge of PySpark, Informatica, SQL (Presto), batch handling through schedulers (Databricks and Astronomer experience will be an advantage), AWS S3, Airflow, and Hive/Presto.
● Basic knowledge of shell scripts and/or Bash commands.
● Able to execute queries in databases and produce outputs.
● Able to understand and execute the steps provided by the Data Team to revive data pipelines.
● Strong verbal and written communication skills and strong interpersonal skills.
● Graduate/Diploma in computer science or information technology.
About Condé Nast
CONDÉ NAST GLOBAL
Condé Nast is a global media house with over a century of distinguished publishing
history. With a portfolio of iconic brands like Vogue, GQ, Vanity Fair, The New Yorker and
Bon Appétit, we at Condé Nast aim to tell powerful, compelling stories of communities,
culture and the contemporary world. Our operations are headquartered in New York and
London, with colleagues and collaborators in 32 markets across the world, including
France, Germany, India, China, Japan, Spain, Italy, Russia, Mexico, and Latin America.
Condé Nast has been raising the industry standards and setting records for excellence in
the publishing space. Today, our brands reach over 1 billion people in print, online, video,
and social media.
CONDÉ NAST INDIA (DATA)
Over the years, Condé Nast successfully expanded and diversified into digital, TV, and
social platforms - in other words, a staggering amount of user data. Condé Nast made the
right move to invest heavily in understanding this data and formed a whole new Data
team entirely dedicated to data processing, engineering, analytics, and visualization. This
team helps drive engagement, fuel process innovation, further content enrichment, and
increase market revenue. The Data team aimed to create a company culture where data
was the common language and facilitate an environment where insights shared in
real-time could improve performance. The Global Data team operates out of Los Angeles,
New York, Chennai, and London. The team at Condé Nast Chennai works extensively with
data to amplify its brands' digital capabilities and boost online revenue. We are broadly
divided into four groups, Data Intelligence, Data Engineering, Data Science, and
Operations (including Product and Marketing Ops, Client Services) along with Data
Strategy and monetization. The teams built capabilities and products to create
data-driven solutions for better audience engagement.
What we look forward to:
We want to welcome bright, new minds into our midst and work together to create
diverse forms of self-expression. At Condé Nast, we encourage the imaginative and
celebrate the extraordinary. We are a media company for the future, with a remarkable
past. We are Condé Nast, and It Starts Here.
Remote only
5 - 10 yrs
₹30L - ₹38L / yr
React.js
NodeJS (Node.js)
HTML/CSS
Javascript
SQL
+2 more
What you will do
● Operate as a key senior team member within the Engineering team driving execution strategies and implementation of product roadmap
● Lead one or more squads of engineers (front-end and back-end) - hiring, managing them as well as 1:1 coaching and mentoring them to advance in their careers
● Lead work in tandem with Product Squads (including PM, Designers, Analysts) and other internal stakeholders to strategize on and build world-class product/s
● Develop scope and objectives of projects and sprints based on technical feasibility, keeping relevant stakeholders in the loop, and ensuring engineering team delivers consistently on the same
● Ensure engineering bandwidth based on availability and allocation, and develop and maintain detailed project plan to track progress and set expectations with engineering squad/s
● Perform regular risk management to evaluate and minimize project risks
● Manage your own time and bandwidth while being hands-on as required on a regular basis
● Perform research and find opportunities to utilize development best practices, form guidelines to improve system productivity, and work on scaling and monitoring
● Scale technology architecture and team to drive short- and long-term product growth
● Be a key engineering voice for Virtual Internships in the external world, showcasing the world-class capabilities of the engineering team through blog posts, events, and more

REQUIREMENTS
Must haves
● Overall 4 - 8 years of product engineering experience (ideally full stack)
● 1+ years of experience managing engineering teams that have a strong record of developing and delivering world-class products
● Experience leading teams/working in an EPD/Squad structure working closely with PM, Designers, Analysts, Engineers and QA
● Deep technical skills and expertise but lean more on leadership skills to get the job done
● Past experience with MERN/MEAN Stack along with good exposure building scalable and distributed systems using microservices
● Good grasp and understanding of technical skills (at least a few):
 Front-end - HTML, CSS, React.js, Webpack, Redux, Javascript, Enzyme
 Back-end - Node.js, SQL, Redis, RestAPI

● Strong communication and interpersonal skills, especially to streamline the flow of information between Engineering and other teams/departments, as well as strong conflict resolution skills
● Good at creative problem solving and critical thinking to resolve issues quickly for Engineers and other teams

Good to have
● Bachelor’s degree in computer science, information technology, or a similar field
● Previous experience on recruiting engineers and coaching/mentoring them
● Experience with high- and low-level software development concepts such as scaling, OOP, load balancers, storage, and application server protocols like HTTP, WebSocket, etc.
● Experience working with complex data structures and data architecture

Bonus ‘to-haves’
● Experience with any set of incredible product development and task management tools (such as ClickUp, JIRA, etc.)
hRINPUTS
Posted by RAHUL BATTA
Bengaluru (Bangalore)
10 - 15 yrs
₹20L - ₹35L / yr
SuccessFactors
Systems Development Life Cycle (SDLC)
HR management system administration
Human Resource Management System (HRMS)
SAP
+1 more

Sr. HR Applications Architect

 

Job Title – Technical Solution Architect/ Sr HR Applications Architect

Team – GIS

Role Type – Individual Contributor

Key relationships –

HR leads of various verticals

Technology and implementation teams

 

You will be supported by your peers and experts across many fields who will help you succeed.

 

Job Responsibilities:

COMPANY's HR Applications team is looking for a passionate, engaging Sr. HR Applications Technical Architect to join our growing team. This role will perform technology evaluation and identification, solution design, and execution of the design for the entire HR applications ecosystem, and will provide technical production support.

 

Designs, develops, modifies, debugs and evaluates programs for functional areas, including but not limited to finance, human resources, manufacturing and marketing. Analyzes existing programs or formulates logic for new systems, devises logic procedures, prepares flowcharting, performs coding and tests/debugs programs. Develops conversion and system implementation plans. Prepares and obtains approval of system and programming documentation. Recommends changes in development, maintenance and system standards. Trains users in conversion and implementation of system. May be internal or external, client-focused, working in conjunction with Professional Services and outsourcing functions. May include company-wide, web-enabled solutions.

 

Role Purpose:

Lead design and implementation of the HR systems of the organization mainly in SAP SuccessFactors and Cornerstone on Demand

Interface with business stakeholders, assess feasibility of the requirements and guide the Technology Leads and Implementation teams to align the solution development

Front-run the LMS migration initiative from SAP SuccessFactors to Cornerstone on Demand, ensuring a scalable solution that accommodates future enhancements and adoption by all BUs of the company

Explore new technologies and practices, be a part of the core team building an HR COE and define the standards and best practices

Act as a SPOC/L3 for current product support activities and the Learning HR ecosystem

Cross-train teams on knowledge transfer across business functions

 

Qualifications & Experience:

Excellent grasp of one or more HR systems, preferably SuccessFactors and Cornerstone on Demand

Proven experience leading System Integrations, Data Migrations, Implementations, Assessments and Process Improvements on technical stack

10+ years of experience as an HR Enterprise Architect

Knowledge of Learning Management Systems (LMS) is desired

Experience working in a Global Production support model

 

Bloom infotech

Posted by Ritu Kohli
Chandigarh
5 - 6 yrs
₹6L - ₹8L / yr
HL7
Javascript
SQL
Visual Basic (VB)
Healthcare
+1 more

The position will coordinate closely with the head of engineering, the architect, and product managers to build a new healthcare software platform; document customer meetings well and assess new requirements; build and manage relationships with key personnel; and ensure successful project delivery.
5-7 years of healthcare industry interoperability knowledge (HL7 and FHIR) for medical record sharing between healthcare organizations
Excellent familiarity with interoperability healthcare standards organizations and initiatives
Excellent knowledge of FHIR specifications (a minimal FHIR API sketch follows this list)
Experience building HL7 interfaces
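For illustration only, a minimal sketch of creating a FHIR R4 Patient resource over the standard FHIR REST API; the server URL and patient details are hypothetical and not from this posting.

```python
# Create a FHIR R4 Patient via the standard FHIR REST API (illustrative sketch).
import requests

FHIR_BASE = "https://fhir.example-hospital.org/r4"  # hypothetical endpoint

patient = {
    "resourceType": "Patient",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "gender": "female",
    "birthDate": "1980-04-12",
}

resp = requests.post(
    f"{FHIR_BASE}/Patient",
    json=patient,
    headers={"Content-Type": "application/fhir+json"},
    timeout=30,
)
resp.raise_for_status()
print("Created Patient with id:", resp.json().get("id"))
```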
Techugo
Agency job
via Smpro by Abhay Kumar
Noida
2 - 7 yrs
₹12L - ₹18L / yr
MongoDB
SQL
DBA

Job description

About the role

The role encompasses administration of the MongoDB database and responsibility for ensuring the performance, high availability, and security of clusters in MongoDB instances.

  • The candidate will be responsible for ensuring that database management policies, processes, and procedures are followed, adhere to ITIL good-practice principles, and are subject to continuous improvement as per PCI standards.
  • He / She will be responsible for reviewing system design changes to ensure they adhere to expected service standards and recommend changes to ensure maximum stability, availability and efficiency of the supported applications.
  • The candidate should understand the application functionality, business logic and work with application stakeholders to understand the requirement and discuss the new application features and propose the right solutions.


What you'll do
  • Install, deploy and manage MongoDB on physical and virtual machines
  • Create, configure and monitor large-scale, secure, MongoDB sharded clusters
  • Support MongoDB in a high availability, multi-datacenter environment
  • Administer MongoDB Ops Manager monitoring, backups and automation
  • Configure and monitor numerous MongoDB instances and replica sets (see the pymongo sketch after this list)
  • Automate routine tasks with your own scripts and open-source tools
  • Improve database backups and test recoverability regularly
  • Study the database needs of our applications and optimize them using MongoDB
  • Maintain database performance and capacity planning
  • Write documentation and collaborate with technical peers online
  • All database administration tasks like backup, restore, SQL optimizations, provisioning infrastructure, setting up graphing, monitoring and alerting tools, replication
  • Performance tuning for high throughput
  • Architecting high availability servers
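As a rough illustration of the monitoring item above, a pymongo sketch that reports replica-set member health; the connection string and member names are hypothetical placeholders.

```python
# Hypothetical replica-set health check with pymongo (illustrative only).
from pymongo import MongoClient

client = MongoClient(
    "mongodb://admin:secret@db01.example.internal:27017/?replicaSet=rs0"
)

status = client.admin.command("replSetGetStatus")
print(f"Replica set: {status['set']}")
for member in status["members"]:
    # stateStr is e.g. PRIMARY, SECONDARY, or RECOVERING.
    print(f"{member['name']:<35} {member['stateStr']:<10} health={member['health']}")

# Collect unhealthy members so an alert can be raised.
unhealthy = [m["name"] for m in status["members"] if m["health"] != 1]
if unhealthy:
    print("ALERT: unhealthy members:", ", ".join(unhealthy))
```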


What qualifications will you need to be successful?

Skills and Qualifications

  • Minimum 1 year of experience in MongoDB technologies; 3 years total in database administration.
  • Install, deploy, and manage MongoDB on physical, virtual, and AWS EC2 instances.
  • Experience with MongoDB active-active sharded cluster setup with high availability.
  • Experience administrating MongoDB on the Linux platform.
  • Experience with MongoDB version upgrades, preferably from version 4.0 to 4.4, on a production environment with zero or very minimal application downtime, either with Ops Manager or a custom script.
  • Experience building database monitoring using tools like AppD, ELK, Grafana, etc.
  • Experience in database performance tuning, including both script tuning and hardware configuration and capacity planning.
  • Good understanding of and experience with MongoDB sharding and disaster recovery planning.
  • Design and implement the backup strategy and BCP process across the MongoDB environments; maintain a uniform backup strategy across the platform.
  • Define database monitoring, monitoring thresholds, and alerts; validate the notifications and maintain the documents for future reference.
  • Tune database performance based on application requirements and maintain a stable environment; analyse existing MongoDB queries as part of the performance improvement program.
  • Work with the engineering team to understand database requirements, guide them on best practices, and optimize queries for better performance.
  • Work with application stakeholders to understand production requirements and propose effective database solutions.
  • Review and understand ongoing business reports and create new ad hoc reports based on requirements.
Saras Analytics Private Limited
Posted by Bhavani Thanga
Hyderabad
4 - 10 yrs
₹6L - ₹10L / yr
Technical Writing
Technical Writer
RESTful APIs
Databases
SQL

Job Description

Role- Technical Writer – SaaS Product

 

About Saras Analytics 

 

We are a passionate group of engineers, analysts, data scientists, and domain experts building products and offering services to accelerate the adoption of data as a strategic asset.

 

Our Products

 

Daton is our ETL platform for analysts and developers. Daton replicates data from SaaS platforms such as Google Analytics, Salesforce, Amazon, and Facebook Ads to cloud data warehouses like Amazon Redshift, Google BigQuery, and Snowflake. Daton consolidates data from a variety of data sources into a data warehouse within minutes and allows analysts to focus on generating insights rather than worrying about building and maintaining a data pipeline.


 

Halo is our eCommerce focused Business Intelligence and Analytics platform. Halo helps eCommerce businesses save time and money by automating the replication, transformation, preparation, and quality control of data for reporting and analysis. Halo transforms silos of eCommerce data into actionable data sets and useful reports that help you grow your business. Visit https://sarasanalytics.com/

 


 

Responsibilities:

·         Responsible for developing technical documentation detailing product features

·         Responsible for creating product manuals, detailed how-to guides and FAQs

·         Responsible for developing and maintaining technical documentation for both external usage and internal guidance across various teams.

·         Responsible for setting up new processes and improving existing technical documentation processes

·         Responsible for converting product support questions into how-to guides

·         You would be working closely with product managers, product support and engineering teams to gather insights to document

 

Eligibility:

-  2-6 years of experience as a Technical Writer and in product documentation.

-  Familiarity with third-party API integration analysis.

-  Excellent logical, analytical and communication skills to interact with the Development team.

 

 

Requirements

 

 

·         2-6 years of experience as a Technical Writer for SaaS products

·         Engineering background in IT or Computer Science

·         Excellent written and oral communication skills

·         Familiarity with APIs, databases, and SQL

·         Hands on experience with documentation and automation tools

·         Hands on experience with version control tools

·         Ability to grasp complex technical concepts and make them easily understandable through documentation while paying attention to detail

·         Prior work experience in E-Commerce domain is a plus

EASEBUZZ

Posted by Amala Baby
Pune
2 - 4 yrs
₹2L - ₹20L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+12 more

Company Profile:

 

Easebuzz is a payment solutions (fintech) company which enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We are focused on building plug-n-play products, including the payment infrastructure, to solve complete business problems. Definitely a wonderful place where all the action related to payments, lending, subscriptions, and eKYC is happening at the same time.

 

We have been consistently profitable and are constantly developing new innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz’s corporate culture is tied into the vision of building a workplace which breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

Experience with ETL, Data Modeling, and Data Architecture

Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with 3rd-party tools: Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue.

Experience with AWS cloud data lake for development of real-time or near real-time use cases

Experience with messaging systems such as Kafka/Kinesis for real-time data ingestion and processing (a minimal Kinesis sketch follows)
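For illustration only, a rough boto3 sketch of pushing one event onto a Kinesis stream, one of the two options named above; the stream name, region, and event fields are hypothetical.

```python
# Hypothetical real-time ingestion sketch with boto3 and Kinesis.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="ap-south-1")

def publish_payment_event(event: dict) -> None:
    """Push one payment event onto a hypothetical Kinesis stream."""
    kinesis.put_record(
        StreamName="payment-events",             # hypothetical stream
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=event["merchant_id"],       # keeps a merchant's events ordered
    )

publish_payment_event({"merchant_id": "M123", "amount": 499.0, "status": "SUCCESS"})
```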

Build data pipeline frameworks to automate high-volume and real-time data delivery

Create prototypes and proof-of-concepts for iterative development.

Experience with NoSQL databases, such as DynamoDB, MongoDB etc

Create and maintain optimal data pipeline architecture.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.


Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and sciences workflow

Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.

 

Employment Type

Full-time

 

European MNC
Agency job
via Kavayah People Consulting by Kavita Singh
Pune
3 - 8 yrs
₹8L - ₹15L / yr
ETL
Data Warehouse (DWH)
SQL
Technical support
The Support Engineer (L2) will serve as a technical support champion for both internal and external customers. 
Key Responsibilities
Support mission critical applications and technologies
Adhere to agreed SLAs.
 
Required Experience, Skills and Qualifications
3-8 years of relevant experience
Proven track record of supporting ETL/Data Warehouse/Business Intelligence solutions
Strong SQL / Unix skills
Excellent written and verbal communication
High degree of analytical and problem-solving skills
Exposure to handling customers from various geographies
Strong debugging and troubleshooting skills
Ability to work with minimum supervision
Team player who shares ideas and resources
Tools and Technologies
ETL Tools: Talend or Informatica experience
BI Tools: Experience supporting Tableau or Jaspersoft or Pentaho or Qlikview
Database: Experience in Oracle or any RDBMS
Wolken Software

Posted by Kavya Hegde
Remote, Bengaluru (Bangalore)
1 - 3 yrs
₹2.5L - ₹5.5L / yr
Java
MySQL
SQL server
Product support
SQL
+2 more

Product Support Engineer (L1 Support) – Immediate Requirement.

Key skills

  • 1-3 years of experience in a Product Support role.
  • Need to understand the product and its features, to demonstrate them to the end users.
  • Sound knowledge of Java and SQL.
  • Need to analyze and resolve product-specific queries using MySQL.
  • Consistently deliver on customer requirements.
  • Engage with the Development team and manage the progress of cases.
  • Should be able to cope with a high-pressure work environment.
  • The work model will be 24x7.

 

Must require

  • Good communication skills, both verbal and written
  • Analytical skills
  • Team player
  • Preferred male applicants only