Experience level: Minimum 10 years
This role leads DBA teams of DBAs at multiple experience levels across a mix of Teradata, Oracle, and SQL.
Minimum 10 years of relevant database and data warehouse experience.
Hands-on experience administering Teradata.
Lead performance analysis and capacity planning, and support batch operations and users with their jobs.
Drive implementation of standards and best practices to optimize database utilization and availability.
Hands-on experience with AWS Cloud infrastructure services such as EC2, S3, and network services.
Proficient in Linux system administration relevant to Teradata management.
Teradata Specific (Mandatory)
Manage and operate 24x7 production and development databases to ensure maximum availability of system resources.
Responsible for core Database Administrator operational activities: system monitoring, user management, space management, troubleshooting, and batch/user support.
Perform DBA tasks in key areas of performance management and reporting, and workload management using TASM.
Manage production/development databases across capacity planning, performance monitoring and tuning, backup/recovery strategy, and space/user/security management, along with problem determination and resolution.
Experience with Teradata workload management, monitoring, and query optimization.
Expertise in system monitoring using Viewpoint and logs.
Proficient in analysing performance and optimizing at different levels.
Ability to create advanced system-level capacity reports and perform root cause analysis.
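Capacity reporting of the kind described above often starts from per-AMP space figures (on Teradata, typically queried from the DBC data dictionary). A minimal sketch, assuming hypothetical per-AMP usage numbers rather than a live system connection:

```python
# Sketch: compute space skew across AMPs, the kind of figure a
# system-level capacity report would include. The per-AMP byte
# counts below are hypothetical stand-ins for what a data
# dictionary query would return on a live system.

def space_skew(per_amp_bytes):
    """Skew factor: how far the busiest AMP sits above the average.

    0.0 means perfectly even distribution; higher values mean one
    AMP holds disproportionately more data, which hurts parallel
    efficiency on Teradata.
    """
    avg = sum(per_amp_bytes) / len(per_amp_bytes)
    peak = max(per_amp_bytes)
    return (peak - avg) / avg if avg else 0.0

# Hypothetical current-perm figures for a 4-AMP system (bytes)
usage = [1_000_000, 1_050_000, 980_000, 1_970_000]
print(f"skew factor: {space_skew(usage):.2f}")  # prints: skew factor: 0.58
```

A report like this flags tables or databases whose rows hash unevenly, which is usually the first lead in a space or performance investigation.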
Oracle Specific (Optional)
Installation of Oracle software on Unix/Linux platforms.
Database lifecycle management: creation, setup, and decommissioning.
Database event/alert monitoring, space management, and user management.
Database upgrades, migrations, and cloning.
Database backup, restore, and recovery using RMAN.
Setup and maintenance of high-availability and disaster recovery solutions.
Proficient in Standby and Data Guard technology.
Hands-on with Oracle Enterprise Manager Cloud Control (OEM CC).
- Teradata Vantage Certified Administrator
The company operates in over 25 countries across six continents and is part of Publicis Media, one of four solution hubs within Publicis Groupe, which is present in over 100 countries and employs nearly 80,000 professionals.
We believe there are better ways for brands to connect with people. And we’re on a mission to guide brands to better connections -- across consumers, channels and partners. These are just some of the services we offer our clients in our quest to deliver ambitious outcomes.
- Servlet and JSP development
- AJAX, jQuery, EXTJS
- Web services creation and consumption
- CMS development experience
- Java Content Repository (JCR)/CRX
- Eclipse IDE
- Apache Sling
- Apache Web Server
- Provide voice and remote-terminal support for Level 1/2 backup and restore issues and submit STARS tickets
- Configure, troubleshoot, and assist with setup of the backup environment
- Follow the alert management strategy for Level 1/2 categories across the system/server alerts received each day
- Perform first- and second-level troubleshooting of backup and restore issues
- Update and maintain pertinent technical and operational documentation
- Open support cases and work with vendors such as Oracle/IBM and Symantec/Iron Mountain to provide necessary logs/information on raised issues
- Coordinate fixes of hardware issues on backup servers with vendors, following hardware management processes and procedures
- Work with tape library teams on tape management
- Install and configure backup software on servers and clients
- Check the day's failed backups and rerun them to meet SLAs
- Set up or assist with lab/dev machine needs for test purposes
- Request tapes from the offsite location to facilitate restores
- Housekeep the user or system environment (e.g., work on heavy hitters and validate the exclude list)
- Work on RPO (Recovery Point Objective) and failure reports
- Manage backup policies and storage media
- Complete daily operational checklists
- Send notifications to user groups or IT admin mailing lists
- Work within change management windows
- Generate reports and metrics as required for operations and planning
- Strive for improvement in operational and business value
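The RPO work mentioned above reduces to a simple check: how long ago was the last successful backup, and does that gap exceed the agreed recovery point objective? A minimal sketch with hypothetical timestamps:

```python
from datetime import datetime, timedelta

def rpo_breached(last_success, now, rpo):
    """True if the gap since the last successful backup exceeds the RPO."""
    return (now - last_success) > rpo

# Hypothetical figures: a 24-hour RPO, last good backup 30 hours ago
now = datetime(2024, 1, 2, 6, 0)
last_success = datetime(2024, 1, 1, 0, 0)
print(rpo_breached(last_success, now, timedelta(hours=24)))  # True
```

An RPO/failure report is essentially this check run across every protected client, with breaches feeding the daily rerun list.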
We are looking for an exceptionally talented Lead Data Engineer with exposure to implementing AWS services to build data pipelines, integrate APIs, and design data warehouses. A candidate with both hands-on and leadership capabilities is ideal for this position.
Qualification: At least a bachelor's degree in Science, Engineering, or Applied Mathematics; a master's degree is preferred.
• Total 6+ years of experience as a Data Engineer and 2+ years of experience in managing a team
• Have minimum 3 years of AWS Cloud experience.
• Well versed in languages such as Python, PySpark, SQL, NodeJS, etc.
• Has extensive experience in the Spark ecosystem and has worked on both real-time and batch processing
• Have experience with AWS Glue, EMR, DMS, Lambda, S3, DynamoDB, Step Functions, Airflow, RDS, Aurora, etc.
• Experience with modern database systems such as Redshift, Presto, Hive, etc.
• Has built data lakes on S3 or with Apache Hudi
• Solid understanding of data warehousing concepts
• Good to have: experience with tools such as Kafka or Kinesis
• Good to have: AWS Developer Associate or Solutions Architect Associate certification
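Data lakes on S3, as called out in the requirements above, are usually laid out with date-based partition keys so that engines like Glue, Athena, or Spark can prune partitions at query time. A minimal sketch of such a key scheme; the bucket and dataset names are hypothetical:

```python
from datetime import date

def partition_key(bucket, dataset, day, filename):
    """Build a Hive-style date-partitioned S3 key (year=/month=/day=)."""
    return (f"s3://{bucket}/{dataset}/"
            f"year={day.year}/month={day.month:02d}/day={day.day:02d}/"
            f"{filename}")

key = partition_key("example-lake", "orders", date(2024, 3, 7), "part-0000.parquet")
print(key)
# s3://example-lake/orders/year=2024/month=03/day=07/part-0000.parquet
```

The zero-padded `year=/month=/day=` convention is what lets a query engine skip whole date ranges instead of scanning every object in the bucket.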
Taking ownership of customer issues reported and seeing problems through to resolution
Understand, interpret, reproduce, and diagnose issues reported by the customers.
Researching, troubleshooting and identifying solutions to resolve product issues
Should be able to handle voice calls, emails, and online web sessions with customers as part of technical troubleshooting
Should exhibit patience and empathy while working with customers, with an aim to drive a positive customer experience
Following standard procedures for proper escalation of unresolved issues to the appropriate internal teams
Following standard procedures for submitting tickets to the engineering team for further investigation of unresolved issues
Contributing actively towards knowledge base articles
Adherence to strict SLAs
Ready to work early morning, general, and afternoon shifts, including weekends, on a rotational basis
Should demonstrate an aptitude and appetite for learning newer technologies while expanding on the core knowledge
The engineer should be available to work in the late evening shift (4pm - 1am) if required.
3-8 years of relevant experience.
Strong technical knowledge on:
- Cloud Technologies – AWS, Azure etc.
- Databases – SQL, Oracle, MySQL etc.
- Operating Systems – Linux, Windows etc
- Networking – Basic networking concepts and troubleshooting
- Programming knowledge – Java, Python (desirable, not a must-have)
  - Prior experience with REST and SOAP calls
Excellent communication skills – English written and verbal
HTTP technology and principles, including REST and SOAP principles (Required)
Good understanding of networking protocols and applications (TCP/IP, proxies, load balancing, firewalls, etc.) (Required)
Working knowledge of database technologies and SQL (Required)
In-depth familiarity with Linux (Required) (advanced user; sysadmin experience a bonus, but not required)
Strong analytical and logical reasoning for technical troubleshooting
Ability to collaborate with cross-functional teams – Dev, QA, Infrastructure teams etc.
Should be a team player who puts the team's success before individual achievements
Knowledge of data integration (Informatica/MuleSoft)
- Provide daily support with resolution of escalated tickets and act as liaison to business and technical leads to ensure issues are resolved in a timely manner.
- Incident resolution and supporting production system deployments.
- Suggest fixes to complex issues by doing a thorough analysis of root cause and impact of the defect.
- Support and deliver within Continuous Integration/Continuous Delivery pipelines.
- Prioritise workload, providing timely and accurate resolutions.
- Perform production support activities which involve assignment of issues and issue analysis and resolution within the specified SLAs.
- Understand Linux: SSH into a Linux box, inspect web logs, etc.
- Understand web apps to be able to troubleshoot issues
- Good to have programming experience with Python.
- You should not be afraid to do some development as well as DevOps work.
- Clear written and oral communication is a must.
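First-line production support of the kind listed above frequently means pulling error lines out of web logs on a Linux box. A minimal sketch that filters and counts error entries; the log lines are hypothetical samples and a real log format may differ:

```python
from collections import Counter

def error_summary(log_lines):
    """Count occurrences of each ERROR message in a web log."""
    errors = Counter()
    for line in log_lines:
        if " ERROR " in line:
            # keep only the message text after the ERROR marker
            errors[line.split(" ERROR ", 1)[1].strip()] += 1
    return errors

# Hypothetical log excerpt
log = [
    "2024-01-01 10:00:01 INFO request served",
    "2024-01-01 10:00:02 ERROR upstream timeout",
    "2024-01-01 10:00:03 ERROR upstream timeout",
    "2024-01-01 10:00:04 ERROR db connection refused",
]
for msg, count in error_summary(log).most_common():
    print(f"{count}x {msg}")
```

Grouping repeated errors this way turns a wall of log output into a short list of candidate root causes before escalating.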
1. 4-6 years of experience as an Oracle Apps technical consultant (techno-functional preferred)
2. Should have worked on interfaces, OAF, forms personalization, Workflow Builder, Oracle Reports, and XML Publisher
3. Should have a good understanding of the Order to Cash and Procure to Pay technical flows of E-Business Suite
4. Should have worked on 2-3 implementation projects and have strong technical knowledge
5. Strong in PL/SQL and Unix, and has worked on custom solutions
6. Able to provide technical solutions
7. Good communication skills
Founded by two MDI alumni, it is a student-centric and personalised learning platform that delivers enjoyable learning content aligned with the state boards. This ed-tech provides a solution that is easy to use, lets students enjoy learning, makes life easy for teachers, and delivers learning in the language students are most comfortable with. The organisation has worked in 14 states across India and was awarded by Google India under "Impacting Change through Digital".
- Taking ownership of the web-based reporting backends of our current solutions/apps and developing, stabilizing, and scaling them further as per product plans
- Ensuring development and ongoing growth of a web-based content management engine as the central node of our apps and tablet-based learning products.
- Licensing, encryption, reporting, and analytics for all our products have to be integrated into a single user and licensing management backend.
Desired Candidate Profile
What you need to have:
- Deep and practical experience with Node.js (primary work component), Firebase, AWS/Heroku, and SQL/NoSQL databases with server-side integration.
- 2-3 years of deep hands-on experience with web-based solutions, backend development, and web development in the above-mentioned tech.
- Should have worked on a Node.js web backend, preferably for an Android app-based solution that scaled to a large number of users.
- Developed, managed and scaled usage/ analytics and reporting of end user based solutions.
- Should have developed and managed APIs.
- Should have practical experience of integrating and managing online payment systems.
- Built and managed online learning platforms with login, video security, online payment and reports.
• No. of openings: 3
• Experience: 2+ years
• Qualification: B.E./MCA/M.Sc. (I.T.)/BCA
We are looking for a PHP Developer with Laravel Experience who must be able to independently work on projects and in a team.
Desired Candidate Profile:-
• Minimum 2 years of experience
• Good knowledge of OOP.
• Experienced with complex SQL queries and database schema design.
• Knowledge of consuming and creating Web Services.
• MySQL and Linux understanding is a must.
• Should be strong with web services, REST APIs, and JSON.
• Should have worked on a variety of projects, with experience in payment gateways, maps, graphs, etc.
• Good coding standards are a must.
• Should have experience with collaboration tools
REQUIRED CITIZENSHIP / WORK PERMIT / VISA STATUS:
Should be currently based in Japan with a valid work visa
- Mandatory skills – Mainframe, COBOL, JCL, DB2
- Should have working knowledge of at least one database – MS SQL Server, Oracle, DB2, PostgreSQL, MongoDB, Firebird SQL
- Language skills – JLPT N3 would be fine
- Graduate with 6+ years of experience in Mainframe-based application development, maintenance, and run-mode operations.
- Ability to collaborate with a diverse set of customer teams to address important issues and resolve them.
- Experienced in collaborating with internal and external stakeholders, and in adapting to and learning the business processes.
- Should have experience working with Mainframe, COBOL, JCL, and DB2.
- Working knowledge of at least one database – MS SQL Server, Oracle, DB2, PostgreSQL, MongoDB, Firebird SQL – and able to diagnose, troubleshoot, and fix complex issues.
Only Japan based candidates with relevant experience can apply.
Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer you will be responsible for developing data pipelines for numerous applications, handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
- Minimum 3+ years of relevant experience on production-grade projects, with hands-on, end-to-end software development
- Expertise in application, data and infrastructure architecture disciplines
- Expert in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies Hadoop, Spark, HIVE, Kafka
- Writing decently optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
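"Handling of various file formats" in practice means normalizing records from CSV, JSON lines, and similar sources into one shape before loading. A minimal stdlib-only sketch (a real pipeline would use Spark or one of the data integration tools listed above); the field names are hypothetical:

```python
import csv
import io
import json

def parse_records(payload, fmt):
    """Normalize CSV or JSON-lines payloads into a list of dicts."""
    if fmt == "csv":
        # DictReader keys each row by the header line
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "jsonl":
        return [json.loads(line) for line in payload.splitlines() if line.strip()]
    raise ValueError(f"unsupported format: {fmt}")

csv_data = "id,amount\n1,9.99\n2,4.50\n"
jsonl_data = '{"id": "3", "amount": "1.25"}\n'
records = parse_records(csv_data, "csv") + parse_records(jsonl_data, "jsonl")
print(records)
```

Converging every input format to the same record shape early is what keeps the downstream transform and load steps format-agnostic.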