Data Engineer - R & R Shiny
Wallero Technologies
Posted by Nikitha Muthuswamy
6 - 8 yrs
₹15L - ₹15L / yr
Full time
Hyderabad
Skills
R Language
R Programming
R Shiny
ETL
Databricks
  • R Programming: A strong understanding of R programming is essential for developing and maintaining R Shiny applications.
  • Shiny Framework: Proficiency in the Shiny framework, including creating interactive web applications and understanding reactive programming concepts.
  • HTML/CSS/JavaScript: Basic knowledge of front-end technologies to customize and style Shiny apps.
  • Data Visualization: Ability to create informative and visually appealing charts and plots using packages like ggplot2 and Plotly.
  • R Package Management: Familiarity with tools like renv or packrat for managing package dependencies within R projects.
  • ETL (Extract, Transform, Load): Knowledge of data extraction, transformation, and loading processes using Databricks (a minimal sketch follows this list).
  • Debugging: Troubleshooting skills to identify and resolve issues in R Shiny apps and Databricks pipelines.
  • Documentation: Documenting code, configurations, and processes to ensure knowledge sharing and ease of maintenance.
  • Training: Ability to conduct training on the above when needed.
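Databricks ETL work of the kind listed above is most often written in PySpark (Databricks also supports SQL, R, and Scala). A minimal, hypothetical sketch of such an extract-transform-load step; the paths, column names, and explicit session setup are illustrative assumptions, not details from this posting:

```python
# Hedged sketch of a Databricks-style ETL step in PySpark. Paths, column names,
# and the output location are hypothetical; on Databricks the `spark` session is
# provided by the runtime, here it is created explicitly so the script runs standalone.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw CSV files landed by an upstream process (hypothetical path).
raw = spark.read.option("header", True).csv("/mnt/raw/orders/*.csv")

# Transform: basic cleansing plus a daily revenue aggregate.
daily = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .groupBy("order_date")
       .agg(F.sum("amount").alias("daily_revenue"))
)

# Load: write a Parquet output for downstream consumption.
daily.write.mode("overwrite").parquet("/mnt/curated/daily_revenue")

spark.stop()
```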



About Wallero technologies

Founded: 2007
Size: 100-1000
Stage: Profitable

Wallero is a global leader in providing business solutions, IT services, and consulting, with a large network of innovation and delivery centers.

Connect with the team
Nikitha Muthuswamy
Satya Gopaal
Priya Karunakaran
Abilash Perumandla
Keerthana M
Company social profiles: LinkedIn, Facebook

Similar jobs

codersbrain
Posted by Tanuj Uppal
Remote only
5 - 8 yrs
₹9L - ₹14L / yr
ETL
Informatica
Data Warehouse (DWH)
API
Microsoft Windows Azure
+2 more
Responsible for the design of solutions using Azure Data Integration Services. This will include
 
Solution design using Microsoft Azure services and related tools.
Design of enterprise data models and Data Warehouse solutions.
Specification of ETL pipelines, data integration and data migration design.
Design & implementation of Master data management solutions.
Specification of Data Quality Management methodology and supporting technology tools.
Working within a project management/agile delivery methodology in a leading role as part of a wider team.
Deal with other stakeholders/ end users in the software development lifecycle – Scrum Masters, Product Owners, Data Architects, and testing teams.
 
Specific Technologies within the Microsoft Azure Cloud stack include:
 
API Development and APIM registration of REST/SOAP APIs.
Azure Service Bus Messaging and Subscribing solutions.
Azure Databricks, Azure Cosmos DB, Azure Data Factory, Azure Logic Apps, Azure Functions.
Azure Storage, Azure SQL Data Warehouse/Synapse, Azure Data Lake.
Emids Technologies
Posted by Rima Mishra
Bengaluru (Bangalore)
5 - 10 yrs
₹4L - ₹18L / yr
Jasper
JasperReports
ETL
JasperSoft
OLAP
+3 more

Job Description - Jasper 

  • Knowledge of Jasper report server administration, installation and configuration
  • Knowledge of report deployment and configuration
  • Knowledge of Jaspersoft Architecture and Deployment
  • Knowledge of User Management in Jaspersoft Server
  • Experience in developing Complex Reports using Jaspersoft Server and Jaspersoft Studio
  • Understand the Overall architecture of Jaspersoft BI
  • Experience in creating Ad Hoc Reports, OLAP, Views, Domains
  • Experience in report server (Jaspersoft) integration with web application
  • Experience with the JasperReports Server web services API and the Jaspersoft Visualize.js API (a minimal REST sketch follows this list)
  • Experience in creating dashboards with visualizations
  • Experience in security and auditing, metadata layer
  • Experience in Interacting with stakeholders for requirement gathering and Analysis
  • Good knowledge of ETL design and development, and of logical and physical data modeling (relational and dimensional)
  • Strong self-initiative to strive for both personal & technical excellence.
  • Coordinate efforts across Product development team and Business Analyst team.
  • Strong business and data analysis skills.
  • Domain knowledge of Healthcare an advantage.
  • Should be strong at coordinating with onshore resources on development.
  • Data-oriented professional with good communication skills and a great eye for detail.
  • Interpret data, analyze results and provide insightful inferences
  • Maintain relationship with Business Intelligence stakeholders
  • Strong Analytical and Problem Solving skills 
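The web-services bullet above refers to JasperReports Server's REST v2 reports service. A minimal, hypothetical Python sketch of pulling a report as a PDF; the server URL, report path, input-control name, and credentials are placeholders, not details from this posting:

```python
# Hedged sketch: fetching a report as PDF from JasperReports Server's REST v2
# reports service. Server URL, report path, and credentials are placeholders.
import requests

JASPER_URL = "https://reports.example.com/jasperserver"   # hypothetical server
REPORT_PATH = "/reports/finance/monthly_summary"          # hypothetical report unit

resp = requests.get(
    f"{JASPER_URL}/rest_v2/reports{REPORT_PATH}.pdf",
    params={"month": "2024-01"},           # report input controls go in the query string
    auth=("jasperadmin", "change_me"),     # basic auth; placeholder credentials
    timeout=60,
)
resp.raise_for_status()

with open("monthly_summary.pdf", "wb") as fh:
    fh.write(resp.content)
```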


SteelEye
Agency job
via wrackle by Naveen Taalanki
Bengaluru (Bangalore)
1 - 8 yrs
₹10L - ₹40L / yr
Python
ETL
Jenkins
CI/CD
pandas
+6 more
Roles & Responsibilities
Expectations of the role
This role reports to the Technical Lead (Support). You will be expected to resolve bugs in the platform that are identified by customers and internal teams. The role will progress towards SDE-2 in 12-15 months, at which point the developer will work on complex problems around scale and build out new features.
 
What will you do?
  • Fix issues with plugins for our Python-based ETL pipelines (see the sketch after this list)
  • Help with automation of standard workflow
  • Deliver Python microservices for provisioning and managing cloud infrastructure
  • Responsible for any refactoring of code
  • Effectively manage challenges associated with handling large volumes of data working to tight deadlines
  • Manage expectations with internal stakeholders and context-switch in a fast-paced environment
  • Thrive in an environment that uses AWS and Elasticsearch extensively
  • Keep abreast of technology and contribute to the engineering strategy
  • Champion best development practices and provide mentorship to others
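As referenced in the first bullet, a plugin fix typically comes down to a small, testable transform step. A minimal sketch of what such a step might look like; the record fields and function name are hypothetical, not SteelEye's actual plugin interface:

```python
# Hedged sketch of the kind of small, testable transform step an ETL plugin
# might expose. The record fields and the function name are hypothetical.
from datetime import datetime, timezone
from typing import Iterable, Iterator


def normalise_trades(records: Iterable[dict]) -> Iterator[dict]:
    """Drop malformed records and coerce epoch timestamps to UTC ISO-8601."""
    for rec in records:
        if not rec.get("trade_id"):
            continue  # skip records the upstream extractor could not identify
        ts = datetime.fromtimestamp(rec["executed_at"], tz=timezone.utc)
        yield {**rec, "executed_at": ts.isoformat()}


if __name__ == "__main__":
    sample = [{"trade_id": "T-1", "executed_at": 1_700_000_000}, {"executed_at": 1}]
    print(list(normalise_trades(sample)))  # only the well-formed record survives
```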
What are we looking for?
  • First and foremost you are a Python developer, experienced with the Python Data stack
  • You love and care about data
  • Your code is an artistic manifesto reflecting how elegant you are in what you do
  • You feel sparks of joy when a new abstraction or pattern arises from your code
  • You follow the DRY (Don’t Repeat Yourself) and KISS (Keep It Short and Simple) principles
  • You are a continuous learner
  • You have a natural willingness to automate tasks
  • You have critical thinking and an eye for detail
  • Excellent ability and experience of working to tight deadlines
  • Sharp analytical and problem-solving skills
  • Strong sense of ownership and accountability for your work and delivery
  • Excellent written and oral communication skills
  • Mature collaboration and mentoring abilities
  • We are keen to know your digital footprint (community talks, blog posts, certifications, courses you have participated in or you are keen to, your personal projects as well as any kind of contributions to the open-source communities if any)
Nice to have:
  • Delivering complex software, ideally in a FinTech setting
  • Experience with CI/CD tools such as Jenkins, CircleCI
  • Experience with code versioning (git / mercurial / subversion)
DHL
Posted by Garima Saraswat
Malaysia
2 - 4 yrs
₹12L - ₹15L / yr
Oracle BI Publisher
Oracle HCM
SQL Server Reporting Services (SSRS)
Business Intelligence (BI)
Oracle Business Intelligence Suite Enterprise Edition (OBIEE)
+3 more

Are you a motivated, organized person seeking a demanding and rewarding opportunity in a fast-paced environment? Would you enjoy being part of a dedicated team that works together to create a relevant, memorable difference in the lives of our customers and employees? If you're looking for change, and you're ready to make changes … we're looking for you.


This role is part of our global Team and will be responsible for driving our digitalization roadmap. You will be responsible for analyzing reporting requirements and defining solutions that meet or exceed those requirements. You will need to understand and apply systems analysis concepts and principles to effectively translate and validate business systems solutions. Further, you will apply IT and internal team methodologies and procedures to ensure solutions are defined in a consistent, standard and repeatable method.


Responsibilities

What are you accountable for achieving as Senior Oracle Fusion Reporting Specialist?

  • Design, Development and Support of Oracle Reporting applications and Dashboards.
  • Interact with internal stakeholders and translate business needs to technical specifications
  • Preparing BIP reports (Data Model and Report Templates)
  • Effectively deliver projects and ongoing support for Oracle HCM BI solutions.
  • Develop data models and reports.


Requirements

What will you need as a successful Senior Oracle Fusion Reporting Specialist / Developer?

  • Bachelor's Degree in Computer Science or more than 2 years of experience in business intelligence projects
  • Experience in programming with Oracle tools and in writing SQL Server/Oracle/SQL Stored Procedures and functions.
  • Experience in a BI environment.
  • Broad understanding of Oracle HCM Cloud Applications and database structure of the HCM application module.
  • Exposure to Oracle BI, Automation, JIRA and ETL will be an added advantage.


Agilisium
Agency job
via Recruiting India by Moumita Santra
Chennai
10 - 19 yrs
₹12L - ₹40L / yr
Big Data
Apache Spark
Spark
PySpark
ETL
+1 more

Job Sector: IT, Software

Job Type: Permanent

Location: Chennai

Experience: 10 - 20 Years

Salary: 12 – 40 LPA

Education: Any Graduate

Notice Period: Immediate

Key Skills: Python, Spark, AWS, SQL, PySpark

Contact at triple eight two zero nine four two double seven

 

Job Description:

Requirements

  • Minimum 12 years' experience
  • In-depth understanding of and knowledge about distributed computing with Spark.
  • Deep understanding of Spark architecture and internals
  • Proven experience in data ingestion, data integration and data analytics with Spark, preferably PySpark.
  • Expertise in ETL processes, data warehousing and data lakes.
  • Hands-on with Python for big data and analytics.
  • Hands-on experience with the Agile Scrum model is an added advantage.
  • Knowledge of CI/CD and orchestration tools is desirable.
  • AWS S3, Redshift and Lambda knowledge is preferred
Thanks
Remote only
12 - 20 yrs
₹24L - ₹40L / yr
API management
Windows Azure
Spring Boot
Microservices
Cloud Computing
+4 more

API Lead Developer

 

Job Overview:

As an API developer for a very large client, you will fill the role of a hands-on Azure API Developer. We are looking for someone with the technical expertise needed to build and maintain sustainable API solutions that support the client's identified needs and expectations.

 

Delivery Responsibilities

  • Implement an API architecture using Azure API Management, including security, API Gateway, Analytics, and API Services (a minimal consumer-side sketch follows this list)
  • Design reusable assets, components, standards, frameworks, and processes to support and facilitate API and integration projects
  • Conduct functional, regression, and load testing on APIs
  • Gather requirements and define the strategy for application integration
  • Develop using the following types of Integration protocols/principles: SOAP and Web services stack, REST APIs, RESTful, RPC/RFC
  • Analyze, design, and coordinate the development of major components of the APIs including hands on implementation, testing, review, build automation, and documentation
  • Work with the DevOps team to package release components for deployment into higher environments
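For the API Management work listed above, a consumer of an APIM-published API typically authenticates with a subscription key sent in the Ocp-Apim-Subscription-Key header. A minimal, hypothetical sketch; the gateway hostname, route, and key are placeholders, not this client's actual setup:

```python
# Hedged sketch: calling an API published through Azure API Management.
# The gateway hostname, route, and subscription key below are placeholders.
import requests

resp = requests.get(
    "https://contoso-apim.azure-api.net/orders/v1/orders/42",   # hypothetical gateway + route
    headers={"Ocp-Apim-Subscription-Key": "<subscription-key>"},  # APIM subscription key header
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```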

Required Qualifications

  • Expert Hands-on experience in the following:
    • Technologies such as Spring Boot, Microservices, API Management & Gateway, Event Streaming, Cloud-Native Patterns, Observability & Performance optimizations
    • Data modelling, Master and Operational Data Stores, Data ingestion & distribution patterns, ETL / ELT technologies, Relational and Non-Relational DB's, DB Optimization patterns
  • At least 5+ years of experience with Azure APIM
  • At least 8+ years’ experience in Azure SaaS and PaaS
  • At least 8+ years’ experience in API Management including technologies such as Mulesoft and Apigee
  • At least last 5 years in consulting with the latest implementation on Azure SaaS services
  • At least 5+ years in MS SQL / MySQL development including data modeling, concurrency, stored procedure development and tuning
  • Excellent communication skills with a demonstrated ability to engage, influence, and encourage partners and stakeholders to drive collaboration and alignment
  • High degree of organization, individual initiative, results and solution oriented, and personal accountability and resiliency
  • Should be a self-starter and team player, capable of working with a team of architects, co-developers, and business analysts

 

Preferred Qualifications:

  • Ability to work as a collaborative team, mentoring and training the junior team members
  • Working knowledge on building and working on/around data integration / engineering / Orchestration
  • Position requires expert knowledge across multiple platforms, integration patterns, processes, data/domain models, and architectures.
  • Candidates must demonstrate an understanding of the following disciplines: enterprise architecture, business architecture, information architecture, application architecture, and integration architecture.
  • Ability to focus on business solutions and understand how to achieve them according to the given timeframes and resources.
  • Recognized as an expert/thought leader. Anticipates and solves highly complex problems with a broad impact on a business area.
  • Experience with Agile Methodology / Scaled Agile Framework (SAFe).
  • Outstanding oral and written communication skills including formal presentations for all levels of management combined with strong collaboration/influencing.

 

Preferred Education/Skills:

  • Prefer Master’s degree
  • Bachelor’s Degree in Computer Science with a minimum of 12+ years relevant experience or equivalent.
Japanese MNC
Agency job
via CIEL HR Services by sundari chitra
Bengaluru (Bangalore)
5 - 10 yrs
₹7L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
IBM InfoSphere DataStage
Datastage
We are looking for an ETL DataStage developer for a Japanese MNC.

Role: ETL DataStage developer.

Experience: 5 years

Location: Bangalore (WFH as of now).

Roles:

Design, develop, and schedule DataStage ETL jobs to extract data from disparate source systems, transform, and load data into EDW for data mart consumption, self-service analytics, and data visualization tools. 

Provides hands-on technical solutions to business challenges & translates them into process/technical solutions. 

Conduct code reviews to communicate high-level design approaches with team members to validate strategic business needs and architectural guidelines are met.

 Evaluate and recommend technical feasibility and effort estimates of proposed technology solutions. Provide operational instructions for dev, QA, and production code deployments while adhering to internal Change Management processes. 

Coordinate Control-M scheduler jobs and dependencies. Recommend and implement ETL process performance tuning strategies and methodologies. Conduct and support data validation, unit testing, and QA integration activities. 

Compose and update technical documentation to ensure compliance to department policies and standards. Create transformation queries, stored procedures for ETL processes, and development automations. 

Interested candidates can forward their profiles.
AYM Marketing Management
Posted by Stephen FitzGerald
Remote only
2 - 8 yrs
₹10L - ₹25L / yr
SQL server
PowerBI
Spotfire
Qlikview
Tableau
+11 more

Senior Product Analyst

Pampers Start Up Team

India / Remote Working

 

 

Team Description

Our internal team focuses on App Development, with data a growing area within the structure. We have a clear vision and strategy coupled with App Development, Data, Testing, Solutions and Operations. The data team sits across the UK and India, whilst other teams sit across Dubai, Lebanon, Karachi and various cities in India.

 

Role Description

In this role you will use a range of tools and technologies, working primarily on data design, data governance, reporting and analytics for the Pampers App.

 

This is a unique opportunity for an ambitious candidate to join a growing business where they will get exposure to a diverse set of assignments, can contribute fully to the growth of the business and where there are no limits to career progression and reward.

 

Responsibilities

● Be the Data Steward and drive governance, with a full understanding of all the data that flows through the apps to all systems

● Work with the campaign team to apply data fixes when there are issues with campaigns

● Investigate and troubleshoot issues with product and campaigns giving clear RCA and impact analysis

● Document data, create data dictionaries and be the “go to” person for understanding what data flows

● Build dashboards and reports using Amplitude, Power BI and present to the key stakeholders

● Carry out ad hoc data investigations into issues with the app, querying data in BigQuery/SQL/Cosmos DB, and present findings back (see the sketch after this list)

● Translate analytics into a clear PowerPoint deck with actionable insights

● Write up clear documentation on processes

● Innovate with new processes or ways of providing analytics and reporting

● Help the data lead to find new ways of adding value
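For the ad hoc investigations bullet above, a typical BigQuery query can be issued with the official google-cloud-bigquery client. A minimal sketch; the project, dataset, and table names are hypothetical, not the Pampers App's actual schema, and authentication is assumed to come from application-default credentials:

```python
# Hedged sketch of an ad hoc investigation query against BigQuery.
# Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="pampers-app-analytics")  # hypothetical project

sql = """
    SELECT event_name, COUNT(*) AS events
    FROM `pampers-app-analytics.app_events.events`   -- hypothetical table
    WHERE DATE(event_timestamp) = CURRENT_DATE()
    GROUP BY event_name
    ORDER BY events DESC
    LIMIT 20
"""

for row in client.query(sql).result():
    print(row.event_name, row.events)
```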

 

 

Requirements

● Bachelor’s degree and a minimum of 4+ years’ experience in an analytical role preferably working in product analytics with consumer app data

● Strong SQL Server and Power BI required

● You have experience with most or all of these tools – SQL Server, Python, Power BI, BigQuery.

● Understanding of mobile app data (Events, CTAs, Screen Views etc)

● Knowledge of data architecture and ETL

● Experience in analyzing customer behavior and providing insightful recommendations

● Self-starter, with a keen interest in technology and highly motivated towards success

● Must be proactive and be prepared to address meetings

● Must show initiative and desire to learn business subjects

● Able to work independently and provide updates to management

● Strong analytical and problem-solving capabilities with meticulous attention to detail

● Excellent problem-solving skills; proven teamwork and communication skills

● Experience working in a fast paced “start-up like” environment

 

Desirable

  • Knowledge of mobile analytical tools (Segment, Amplitude, Adjust, Braze and Google Analytics)
  • Knowledge of loyalty data
Remote, NCR (Delhi | Gurgaon | Noida)
3 - 12 yrs
₹8L - ₹14L / yr
Data Warehouse (DWH)
ETL
Amazon Redshift

Responsible for planning, connecting, designing, scheduling, and deploying data warehouse systems. Develops, monitors, and maintains ETL processes, reporting applications, and data warehouse design.

Role and Responsibility

  • Plan, create, coordinate, and deploy data warehouses.
  • Design the end-user interface.
  • Create best practices for data loading and extraction.
  • Develop data architecture, data modeling, and ETL mapping solutions within a structured data warehouse environment.
  • Develop reporting applications and maintain data warehouse consistency.
  • Facilitate requirements gathering using expert listening skills and develop unique, simple solutions to meet the immediate and long-term needs of business customers.
  • Supervise design throughout the implementation process.
  • Design and build cubes while performing custom scripts.
  • Develop and implement ETL routines according to the DWH design and architecture (a minimal sketch follows this list).
  • Support the development and validation required through the lifecycle of the DWH and Business Intelligence systems, maintain user connectivity, and provide adequate security for the data warehouse.
  • Monitor the performance and integrity of the DWH and BI systems, and provide corrective and preventative maintenance as required.
  • Manage multiple projects at once.
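As referenced in the ETL routines bullet, a simple routine often stages a flat-file extract into the warehouse before in-database transformations take over. A minimal, hypothetical sketch using pandas and SQLAlchemy; the connection string, file path, and table names are placeholders, not this employer's environment:

```python
# Hedged sketch of a simple ETL routine loading a flat-file extract into a
# warehouse staging table. Connection string, file path, and table names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://etl_user:***@dwh-host:5439/warehouse")

# Extract: read the daily flat-file export.
orders = pd.read_csv("/data/extracts/orders_2024-01-31.csv", parse_dates=["order_date"])

# Transform: light cleansing before heavier in-database transformations.
orders = orders.dropna(subset=["order_id"]).drop_duplicates(subset=["order_id"])
orders["amount"] = orders["amount"].astype(float)

# Load into a staging table; downstream routines move it into the marts.
orders.to_sql("stg_orders", engine, schema="staging", if_exists="replace", index=False)
```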

DESIRABLE SKILL SET

  • Experience with technologies such as MySQL, MongoDB, and SQL Server 2008, as well as with newer ones like SSIS and stored procedures
  • Exceptional experience developing code, testing for quality assurance, administering RDBMSs, and monitoring databases
  • High proficiency in dimensional modeling techniques and their applications
  • Strong analytical, consultative, and communication skills, as well as the ability to exercise good judgment and work with both technical and business personnel
  • Several years' working experience with Tableau, MicroStrategy, Information Builders, and other reporting and analytical tools
  • Working knowledge of SAS and R code used in data processing and modeling tasks
  • Strong experience with Hadoop, Impala, Pig, Hive, YARN, and other “big data” technologies such as AWS Redshift or Google Big Data

 

Qvantel Software Solutions Ltd
Posted by Srinivas Bollipally
Hyderabad
3 - 7 yrs
₹6L - ₹20L / yr
Data Migration
BSS
ETL
We are now looking for passionate DATA MIGRATION DEVELOPERS to work at our Hyderabad site.

Role Description: We are looking for data migration developers for our BSS delivery projects. Your main goal is to analyse migration data, create the migration solution and execute the data migration. You will work as part of the migration team in cooperation with our migration architect and BSS delivery project manager. You have a solid background with telecom BSS and experience in data migrations. You will be expected to interpret data analysis produced by Business Analysts, raise issues or questions, and work directly with the client on-site to resolve them. You must therefore be capable of understanding the telecom business behind a technical solution.

Requirements:
  • Understanding of different data migration approaches and the capability to adapt requirements to migration tool development and utilization
  • Capability to analyse the shape and health of source data
  • Extraction of data from multiple legacy sources
  • Building transformation code to adhere to data mappings
  • Loading data to either new or existing target solutions

We appreciate:
  • Deep knowledge of ETL processes and/or other migration tools
  • Proven experience in data migrations with high volumes and in business-critical systems in the telecom business
  • Experience in telecom business support systems
  • Ability to apply innovation and improvement to the data migration/support processes and to manage multiple priorities effectively

We can offer you:
  • Interesting and challenging work in a fast-growing, customer-oriented company
  • An international and multicultural working environment with experienced and enthusiastic colleagues
  • Plenty of opportunities to learn, grow and progress in your career

At Qvantel we have built a young, dynamic culture where people are motivated to learn and develop themselves, are used to working both independently and in teams, have a systematic, hands-on working style and a can-do attitude. Our people are used to communicating across cultures and time zones. A sense of humor can also come in handy.

Don’t hesitate to ask for more information from Srinivas Bollipally, our Recruitment Specialist, reachable at [email protected]