- R Programming: A strong understanding of R programming is essential for developing and maintaining RShiny applications.
- Shiny Framework: Proficiency in the Shiny framework, including creating interactive web applications and understanding reactive programming concepts.
- Data Visualization: Ability to create informative and visually appealing charts and plots using packages like ggplot2 and Plotly.
- R Package Management: Familiarity with tools like renv or packrat for managing package dependencies within R projects.
- ETL (Extract, Transform, Load): Knowledge of data extraction, transformation, and loading processes using Databricks.
- Debugging: Troubleshooting skills to identify and resolve issues in RShiny apps and Databricks pipelines.
- Documentation: Documenting code, configurations, and processes to ensure knowledge sharing and ease of maintenance.
- Training: Ability to conduct training on the above when needed.
Job Description - Jasper
- Knowledge of JasperReports Server administration, installation, and configuration
- Knowledge of report deployment and configuration
- Knowledge of Jaspersoft Architecture and Deployment
- Knowledge of User Management in Jaspersoft Server
- Experience in developing Complex Reports using Jaspersoft Server and Jaspersoft Studio
- Understand the Overall architecture of Jaspersoft BI
- Experience in creating Ad Hoc Reports, OLAP, Views, Domains
- Experience in report server (Jaspersoft) integration with web application
- Experience with the JasperReports Server web services API and the Jaspersoft Visualize.js API
- Experience in creating dashboards with visualizations
- Experience in security and auditing, metadata layer
- Experience in interacting with stakeholders for requirement gathering and analysis
- Good knowledge of ETL design and development, and of logical and physical data modeling (relational and dimensional)
- Strong self-initiative to strive for both personal and technical excellence.
- Coordinate efforts across Product development team and Business Analyst team.
- Strong business and data analysis skills.
- Domain knowledge of healthcare is an advantage.
- Should be strong at coordinating with onshore resources on development.
- Data-oriented professional with good communication skills and a great eye for detail.
- Interpret data, analyze results and provide insightful inferences
- Maintain relationships with Business Intelligence stakeholders
- Strong Analytical and Problem Solving skills
Fix issues with plugins for our Python-based ETL pipelines
Help with automation of standard workflow
Deliver Python microservices for provisioning and managing cloud infrastructure
Responsible for any refactoring of code
Effectively manage challenges associated with handling large volumes of data working to tight deadlines
Manage expectations with internal stakeholders and context-switch in a fast-paced environment
Thrive in an environment that uses AWS and Elasticsearch extensively
Keep abreast of technology and contribute to the engineering strategy
Champion best development practices and provide mentorship to others
First and foremost you are a Python developer, experienced with the Python Data stack
You love and care about data
Your code is an artistic manifesto reflecting how elegant you are in what you do
You feel sparks of joy when a new abstraction or pattern arises from your code
You follow the DRY (Don’t Repeat Yourself) and KISS (Keep It Short and Simple) principles
You are a continuous learner
You have a natural willingness to automate tasks
You have critical thinking and an eye for detail
Excellent ability and experience of working to tight deadlines
Sharp analytical and problem-solving skills
Strong sense of ownership and accountability for your work and delivery
Excellent written and oral communication skills
Mature collaboration and mentoring abilities
We are keen to know your digital footprint (community talks, blog posts, certifications, courses you have taken or would like to take, personal projects, and any contributions to open-source communities)
Delivering complex software, ideally in a FinTech setting
Experience with CI/CD tools such as Jenkins, CircleCI
Experience with code versioning (git / mercurial / subversion)
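The DRY and KISS principles mentioned above can be illustrated with a short Python sketch (a hypothetical example, not taken from any particular codebase):

```python
# Repetitive version: the same discount rule is copy-pasted per product type,
# so a pricing change must be made in several places.
def price_book(base):
    return base - base * 0.10

def price_dvd(base):
    return base - base * 0.10

# DRY version: the shared rule lives in one small function (KISS), so a
# change to the discount applies everywhere at once.
def discounted_price(base, rate=0.10):
    return base - base * rate

print(discounted_price(100.0))  # 90.0
```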
Are you a motivated, organized person seeking a demanding and rewarding opportunity in a fast-paced environment? Would you enjoy being part of a dedicated team that works together to create a relevant, memorable difference in the lives of our customers and employees? If you're looking for change, and you're ready to make changes … we're looking for you.
This role is part of our global Team and will be responsible for driving our digitalization roadmap. You will be responsible for analyzing reporting requirements and defining solutions that meet or exceed those requirements. You will need to understand and apply systems analysis concepts and principles to effectively translate and validate business systems solutions. Further, you will apply IT and internal team methodologies and procedures to ensure solutions are defined in a consistent, standard and repeatable method.
What are you accountable for achieving as Senior Oracle Fusion Reporting Specialist?
- Design, development, and support of Oracle reporting applications and dashboards.
- Interact with internal stakeholders and translate business needs to technical specifications
- Preparing BIP reports (Data Model and Report Templates)
- Effectively deliver projects and ongoing support for Oracle HCM BI solutions.
- Develop data models and reports.
What will you need as a successful Senior Oracle Fusion Reporting Specialist / Developer?
- Bachelor's Degree in Computer Science or more than 2 years of experience in business intelligence projects
- Experience in programming with Oracle tools and in writing SQL Server/Oracle/SQL Stored Procedures and functions.
- Experience in a BI environment.
- Broad understanding of Oracle HCM Cloud Applications and database structure of the HCM application module.
- Exposure to Oracle BI, automation, JIRA, and ETL will be an added advantage.
Job Sector: IT, Software
Job Type: Permanent
Experience: 10 - 20 Years
Salary: 12 – 40 LPA
Education: Any Graduate
Notice Period: Immediate
Key Skills: Python, Spark, AWS, SQL, PySpark
Contact at triple eight two zero nine four two double seven
- Minimum 12 years' experience
- In-depth understanding and knowledge of distributed computing with Spark.
- Deep understanding of Spark architecture and internals.
- Proven experience in data ingestion, data integration, and data analytics with Spark, preferably PySpark.
- Expertise in ETL processes, data warehousing, and data lakes.
- Hands-on with Python for big data and analytics.
- Hands-on experience with the Agile Scrum model is an added advantage.
- Knowledge of CI/CD and orchestration tools is desirable.
- AWS S3, Redshift, and Lambda knowledge is preferred
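The extract/transform/load pattern referenced above can be sketched in plain Python (a minimal illustration only; in a real Spark job each stage would be a DataFrame operation, and the field names here are invented):

```python
# Extract: in practice this would read from S3, a database, or an API;
# here a list of dicts stands in for the raw source records.
raw_records = [
    {"id": "1", "amount": "19.99", "region": "us"},
    {"id": "2", "amount": "5.00",  "region": "eu"},
    {"id": "3", "amount": "bad",   "region": "us"},
]

def transform(record):
    """Cast types and normalize values; return None for unparseable rows."""
    try:
        return {
            "id": int(record["id"]),
            "amount": float(record["amount"]),
            "region": record["region"].upper(),
        }
    except ValueError:
        return None  # quarantine malformed rows instead of failing the job

# Transform: apply the row-level function and drop rejected rows.
clean = [row for row in (transform(r) for r in raw_records) if row is not None]

# Load: aggregate into a target structure (a warehouse table in real life).
totals = {}
for row in clean:
    totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]

print(totals)  # {'US': 19.99, 'EU': 5.0}
```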
API Lead Developer
As an API developer for a very large client, you will fill the role of a hands-on Azure API developer. We are looking for someone with the technical expertise to build and maintain sustainable API solutions that support the client's identified needs and expectations.
- Implement an API architecture using Azure API Management, including security, API Gateway, Analytics, and API Services
- Design reusable assets, components, standards, frameworks, and processes to support and facilitate API and integration projects
- Conduct functional, regression, and load testing on APIs
- Gather requirements and define the strategy for application integration
- Develop using the following types of Integration protocols/principles: SOAP and Web services stack, REST APIs, RESTful, RPC/RFC
- Analyze, design, and coordinate the development of major components of the APIs including hands on implementation, testing, review, build automation, and documentation
- Work with DevOps team to package release components to deploy into higher environment
- Expert Hands-on experience in the following:
- Technologies such as Spring Boot, Microservices, API Management & Gateway, Event Streaming, Cloud-Native Patterns, Observability & Performance optimizations
- Data modelling, Master and Operational Data Stores, Data ingestion & distribution patterns, ETL / ELT technologies, Relational and Non-Relational DB's, DB Optimization patterns
- At least 5 years of experience with Azure APIM
- At least 8 years’ experience in Azure SaaS and PaaS
- At least 8 years’ experience in API management, including technologies such as MuleSoft and Apigee
- At least the last 5 years in consulting, with the latest implementation on Azure SaaS services
- At least 5 years in MS SQL / MySQL development, including data modeling, concurrency, and stored procedure development and tuning
- Excellent communication skills with a demonstrated ability to engage, influence, and encourage partners and stakeholders to drive collaboration and alignment
- High degree of organization, individual initiative, results and solution oriented, and personal accountability and resiliency
- Should be a self-starter and team player, capable of working with a team of architects, co-developers, and business analysts
- Ability to work as a collaborative team, mentoring and training the junior team members
- Working knowledge of building data integration, data engineering, and orchestration solutions
- Position requires expert knowledge across multiple platforms, integration patterns, processes, data/domain models, and architectures.
- Candidates must demonstrate an understanding of the following disciplines: enterprise architecture, business architecture, information architecture, application architecture, and integration architecture.
- Ability to focus on business solutions and understand how to achieve them according to the given timeframes and resources.
- Recognized as an expert/thought leader. Anticipates and solves highly complex problems with a broad impact on a business area.
- Experience with Agile Methodology / Scaled Agile Framework (SAFe).
- Outstanding oral and written communication skills including formal presentations for all levels of management combined with strong collaboration/influencing.
- Master’s degree preferred
- Bachelor’s Degree in Computer Science with a minimum of 12+ years relevant experience or equivalent.
Role: ETL DataStage developer
Experience: 5 years
Location: Bangalore (WFH as of now)
Design, develop, and schedule DataStage ETL jobs to extract data from disparate source systems, transform, and load data into EDW for data mart consumption, self-service analytics, and data visualization tools.
Provides hands-on technical solutions to business challenges & translates them into process/technical solutions.
Conduct code reviews to communicate high-level design approaches with team members and to validate that strategic business needs and architectural guidelines are met.
Evaluate and recommend technical feasibility and effort estimates of proposed technology solutions. Provide operational instructions for dev, QA, and production code deployments while adhering to internal Change Management processes.
Coordinate Control-M scheduler jobs and dependencies. Recommend and implement ETL process performance-tuning strategies and methodologies. Conduct and support data validation, unit testing, and QA integration activities.
Compose and update technical documentation to ensure compliance with department policies and standards. Create transformation queries, stored procedures for ETL processes, and development automations.
Interested candidates can forward their profiles.
Senior Product Analyst
Pampers Start Up Team
India / Remote Working
Our internal team focuses on app development, with data a growing area within the structure. We have a clear vision and strategy spanning app development, data, testing, solutions, and operations. The data team sits across the UK and India, while other teams sit across Dubai, Lebanon, Karachi, and various cities in India.
In this role you will use a range of tools and technologies, working primarily on data design, data governance, reporting, and analytics for the Pampers App.
This is a unique opportunity for an ambitious candidate to join a growing business where they will get exposure to a diverse set of assignments, can contribute fully to the growth of the business and where there are no limits to career progression and reward.
● To be the Data Steward and drive governance having full understanding of all the data that flows through the Apps to all systems
● Work with the campaign team to apply data fixes when issues arise with campaigns
● Investigate and troubleshoot issues with product and campaigns giving clear RCA and impact analysis
● Document data, create data dictionaries, and be the “go to” person for understanding how data flows
● Build dashboards and reports using Amplitude, Power BI and present to the key stakeholders
● Carry out ad hoc investigations into issues with the app, querying data in BigQuery/SQL/CosmosDB, and present findings back
● Translate analytics into a clear PowerPoint deck with actionable insights
● Write up clear documentation on processes
● Innovate with new processes or ways of providing analytics and reporting
● Help the data lead to find new ways of adding value
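An ad hoc investigation into app event data, as described above, might look like the following sketch (using Python's built-in sqlite3 as a stand-in for BigQuery; the table and column names are invented for illustration, though the SQL shape would carry over):

```python
import sqlite3

# In-memory database standing in for the analytics warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE app_events (user_id TEXT, screen TEXT, event TEXT)")
conn.executemany(
    "INSERT INTO app_events VALUES (?, ?, ?)",
    [
        ("u1", "home",    "screen_view"),
        ("u1", "rewards", "screen_view"),
        ("u2", "home",    "screen_view"),
        ("u2", "home",    "cta_tap"),
        ("u3", "home",    "screen_view"),
        ("u3", "rewards", "screen_view"),
    ],
)

# Which screens get the most views? A typical starting query when
# investigating an issue with a particular screen or CTA.
rows = conn.execute(
    """
    SELECT screen, COUNT(*) AS views
    FROM app_events
    WHERE event = 'screen_view'
    GROUP BY screen
    ORDER BY views DESC
    """
).fetchall()

print(rows)  # [('home', 3), ('rewards', 2)]
```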
● Bachelor’s degree and a minimum of 4 years’ experience in an analytical role, preferably in product analytics with consumer app data
● Strong SQL Server and Power BI skills required
● You have experience with most or all of these tools – SQL Server, Python, Power BI, BigQuery.
● Understanding of mobile app data (Events, CTAs, Screen Views etc)
● Knowledge of data architecture and ETL
● Experience in analyzing customer behavior and providing insightful recommendations
● Self-starter, with a keen interest in technology and highly motivated towards success
● Must be proactive and prepared to speak in meetings
● Must show initiative and desire to learn business subjects
● Able to work independently and provide updates to management
● Strong analytical and problem-solving capabilities with meticulous attention to detail
● Excellent problem-solving skills; proven teamwork and communication skills
● Experience working in a fast paced “start-up like” environment
- Knowledge of mobile analytical tools (Segment, Amplitude, Adjust, Braze and Google Analytics)
- Knowledge of loyalty data
Responsible for planning, connecting, designing, scheduling, and deploying data warehouse systems. Develops, monitors, and maintains ETL processes, reporting applications, and data warehouse design.
Role and Responsibility
· Plan, create, coordinate, and deploy data warehouses.
· Design end-user interfaces.
· Create best practices for data loading and extraction.
· Develop data architecture, data modeling, and ETL mapping solutions within a structured data warehouse environment.
· Develop reporting applications and maintain data warehouse consistency.
· Facilitate requirements gathering using expert listening skills and develop unique simple solutions to meet the immediate and long-term needs of business customers.
· Supervise design throughout implementation process.
· Design and build cubes, writing custom scripts where needed.
· Develop and implement ETL routines according to the DWH design and architecture.
· Support the development and validation required through the lifecycle of the DWH and Business Intelligence systems, maintain user connectivity, and provide adequate security for data warehouse.
· Monitor DWH and BI system performance and integrity; provide corrective and preventive maintenance as required.
· Manage multiple projects at once.
DESIRABLE SKILL SET
· Experience with technologies such as MySQL, MongoDB, and SQL Server, as well as with SSIS and stored procedures
· Exceptional experience developing code, testing for quality assurance, administering RDBMSs, and monitoring databases
· High proficiency in dimensional modeling techniques and their applications
· Strong analytical, consultative, and communication skills; as well as the ability to make good judgment and work with both technical and business personnel
· Several years working experience with Tableau, MicroStrategy, Information Builders, and other reporting and analytical tools
· Working knowledge of SAS and R code used in data processing and modeling tasks
· Strong experience with Hadoop, Impala, Pig, Hive, YARN, and other “big data” technologies such as Amazon Redshift or Google BigQuery