11+ Teradata DBA Jobs in India
Position Description
We are looking for a highly motivated, hands-on Sr. Database/Data Warehouse Data Analytics developer to work at our Bangalore, India location. The ideal candidate will have a solid software technology background, with experience building and supporting robust, secure, multi-platform financial applications, and will contribute to the Fully Paid Lending (FPL) project. The successful candidate will be a proficient and productive developer and a team leader with good communication skills and a demonstrated sense of ownership.
Responsibilities
- Produce service metrics, analyze trends, and identify opportunities to improve the level of service and reduce cost as appropriate.
- Design, develop, and maintain database schemas and objects throughout the lifecycle of the applications.
- Support implemented solutions by monitoring and tuning queries and data loads, addressing user questions concerning data integrity, monitoring performance, and communicating functional and technical issues.
- Help the team by managing production releases.
- Troubleshoot data issues and work with data providers for resolution.
- Closely work with business and applications teams in implementing the right design and solution for the business applications.
- Build reporting solutions for WM Risk Applications.
- Work as part of a banking Agile Squad / Fleet.
- Perform proof of concepts in new areas of development.
- Support continuous improvement of automated systems.
- Participate in all aspects of SDLC (analysis, design, coding, testing and implementation).
Required Skills
- 5 to 7 years of strong database (SQL) knowledge, ETL (Informatica PowerCenter), and Unix shell scripting.
- Database (preferably Teradata) knowledge: database design, performance tuning, writing complex DB programs, etc.
- Demonstrated proficiency in analyzing and resolving application performance problems.
- Database fundamentals; relational and data warehouse concepts.
- Should be able to lead a team of 2-3 members and guide them technically and functionally in their day-to-day work.
- Ensure designs, code and processes are optimized for performance, scalability, security, reliability, and maintainability.
- Understanding of requirements of large enterprise applications (security, entitlements, etc.)
- Provide technical leadership throughout the design process and guidance with regards to practices, procedures, and techniques. Serve as a guide and mentor for junior level Software Development Engineers
- Exposure to JIRA or other ALM tools to create a productive, high-quality development environment.
- Proven experience in working within an Agile framework.
- Strong problem-solving skills and the ability to produce high quality work independently and work well in a team.
- Excellent communication skills (written, interpersonal, presentation), with the ability to easily and effectively interact and negotiate with business stakeholders.
- Ability and strong desire to learn new languages, frameworks, tools, and platforms quickly.
- Growth mindset, personal excellence, collaborative spirit
Good-to-have skills:
- Prior work experience with Azure or other cloud platforms such as Google Cloud, AWS, etc.
- Exposure to programming languages such as Python, R, or Java, and experience implementing data analytics projects.
- Experience in Git and development workflows.
- Prior experience in Banking and Financial domain.
- Exposure to security-based lending is a plus.
- Experience with Reporting/BI Tools is a plus.
Intuitive Cloud (www.intuitive.cloud) is one of the fastest-growing top-tier cloud solutions and SDx engineering services companies, supporting 80+ global enterprise customers across the Americas, Europe, and the Middle East.
Intuitive is a recognized professional and managed services partner with core strengths in cloud (public/hybrid), security, GRC, DevSecOps, SRE, application modernization/containers/Kubernetes-as-a-service, and cloud application delivery.
Data Engineering:
- 9+ years' experience as a data engineer.
- Must have 4+ years implementing data engineering solutions with Databricks.
- This is a hands-on role building data pipelines using Databricks; hands-on technical experience with Apache Spark is required.
- Must have deep expertise in at least one programming language for data processing (Python, Scala). Experience with Python, PySpark, Hadoop, Hive, and/or Spark to write data pipelines and data processing layers.
- Must have worked with relational databases such as Snowflake; good SQL experience for writing complex SQL transformations.
- Performance tuning of Spark SQL running on S3/Data Lake/Delta Lake storage, and strong knowledge of Databricks and cluster configurations.
- Hands-on architectural experience.
- Nice to have: Databricks administration, including the security and infrastructure features of Databricks.
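As a rough illustration of the "data pipelines and data processing layers" the role above describes, here is a minimal, framework-free Python sketch of the same shape: read raw records, filter bad rows, and aggregate. In real Databricks/PySpark work this would use `spark.read`, `DataFrame.filter`, and `groupBy`; all record and field names here are invented for the example.

```python
from collections import defaultdict

def build_pipeline(records):
    """Filter out rows with missing amounts, then total amounts per account."""
    cleaned = (r for r in records if r.get("amount") is not None)  # filter step
    totals = defaultdict(float)
    for r in cleaned:                                              # aggregate step
        totals[r["account"]] += r["amount"]
    return dict(totals)

# Made-up sample records standing in for a raw source extract.
raw = [
    {"account": "A1", "amount": 100.0},
    {"account": "A1", "amount": 50.0},
    {"account": "A2", "amount": None},   # dropped by the filter step
    {"account": "A2", "amount": 25.0},
]
print(build_pipeline(raw))  # → {'A1': 150.0, 'A2': 25.0}
```

The same filter/derive/aggregate structure carries over directly to Spark DataFrame code, just expressed against a distributed engine instead of in-memory Python.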
Are you a motivated, organized person seeking a demanding and rewarding opportunity in a fast-paced environment? Would you enjoy being part of a dedicated team that works together to create a relevant, memorable difference in the lives of our customers and employees? If you're looking for change, and you're ready to make changes … we're looking for you.
This role is part of our global team and will be responsible for driving our digitalization roadmap. You will be responsible for analyzing reporting requirements and defining solutions that meet or exceed those requirements. You will need to understand and apply systems analysis concepts and principles to effectively translate and validate business systems solutions. Further, you will apply IT and internal team methodologies and procedures to ensure solutions are defined in a consistent, standard, and repeatable method.
Responsibilities
What are you accountable for achieving as Senior Oracle Fusion Reporting Specialist?
- Design, develop, and support Oracle reporting applications and dashboards.
- Interact with internal stakeholders and translate business needs into technical specifications.
- Prepare BIP reports (data models and report templates).
- Effectively deliver projects and ongoing support for Oracle HCM BI solutions.
- Develop data models and reports.
Requirements
What will you need as a successful Senior Oracle Fusion Reporting Specialist / Developer?
- Bachelor's degree in Computer Science, or more than 2 years of experience in business intelligence projects.
- Experience in programming with Oracle tools and in writing SQL Server/Oracle stored procedures and functions.
- Experience in a BI environment.
- Broad understanding of Oracle HCM Cloud applications and the database structure of the HCM application modules.
- Exposure to Oracle BI, automation, JIRA, and ETL will be an added advantage.
- Designing and coding the data warehousing system to desired company specifications
- Conducting preliminary testing of the warehousing environment before data is extracted
- Extracting company data and transferring it into the new warehousing environment
- Testing the new storage system once all the data has been transferred
- Troubleshooting any issues that may arise
- Providing maintenance support
- Consulting with data management teams to get a big-picture idea of the company’s data storage needs
- Presenting the company with warehousing options based on their storage needs
- 1-3 years of experience in Informatica PowerCenter.
- Excellent knowledge of Oracle Database and PL/SQL, including stored procedures, functions, user-defined functions, table partitioning, indexes, views, etc.
- Knowledge of SQL Server databases.
- Hands-on experience in Informatica PowerCenter and database performance tuning and optimization, including complex query optimization techniques; understanding of ETL control frameworks.
- Experience in UNIX shell/Perl scripting.
- Good communication skills, including the ability to write clearly
- Able to function effectively as a member of a team
- Proactive with respect to personal and technical development
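To illustrate the "ETL control framework" idea from the list above: a typical control check reconciles source and target row counts after a load, and only then marks the batch complete. This is a sketch using Python's built-in sqlite3 with made-up table names; production frameworks (e.g. around Informatica PowerCenter) capture comparable counts in workflow and session metadata.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical source and target tables for the reconciliation example.
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])

# Simulated load: copy all source rows into the target.
cur.execute("INSERT INTO tgt_orders SELECT * FROM src_orders")

# Control check: the batch passes only if source and target counts match.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
batch_status = "COMPLETE" if src_count == tgt_count else "FAILED"
print(src_count, tgt_count, batch_status)  # → 3 3 COMPLETE
```

Real control frameworks extend this with checksum comparisons, reject counts, and batch audit tables, but the row-count reconciliation above is the core pattern.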
Pingahla is recruiting Business Intelligence consultants/senior consultants who can help us with information management projects (domestic, onshore, and offshore) as developers and team leads. Candidates are expected to have 3-6 years of experience with Informatica PowerCenter/Talend DI/Informatica Cloud and must be very proficient with business intelligence in general. The job is based out of our Pune office.
Responsibilities:
- Manage the customer relationship by serving as the single point of contact before, during and after engagements.
- Architect data management solutions.
- Provide technical leadership to other consultants and/or customer/partner resources.
- Design, develop, test and deploy data integration solutions in accordance with customer’s schedule.
- Supervise and mentor all intermediate and junior level team members.
- Provide regular reports to communicate status both internally and externally.
Qualifications:
A typical profile that would suit this position has the following background:
- A graduate from a reputed engineering college
- Excellent IQ and analytical skills, with the ability to grasp new concepts and learn new technologies.
- A willingness to work with a small team in a fast-growing environment.
- A good knowledge of Business Intelligence concepts
Mandatory Requirements:
- Knowledge of Business Intelligence
- Good knowledge of at least one of the following data integration tools - Informatica Powercenter, Talend DI, Informatica Cloud
- Knowledge of SQL
- Excellent English and communication skills
- Intelligent, quick to learn new technologies
- Track record of accomplishment and effectiveness with handling customers and managing complex data management needs
Role: Talend Developer
Location: Coimbatore
Experience: 4+ years
Skills: Talend, any DB
Notice period: Immediate to 15 days
- Expertise in designing and implementing enterprise scale database (OLTP) and Data warehouse solutions.
- Hands-on experience in implementing Azure SQL Database, Azure SQL Data Warehouse (Azure Synapse Analytics), and big data processing using Azure Databricks and Azure HDInsight.
- Expert in writing T-SQL programming for complex stored procedures, functions, views and query optimization.
- Should be aware of database development for both on-premises and SaaS applications using SQL Server and PostgreSQL.
- Experience in ETL and ELT implementations using Azure Data Factory V2 and SSIS.
- Experience and expertise in building machine learning models using logistic and linear regression, decision tree, and random forest algorithms.
- PolyBase queries for exporting and importing data into Azure Data Lake.
- Building data models both tabular and multidimensional using SQL Server data tools.
- Writing data preparation, cleaning, and processing steps using Python, Scala, and R.
- Programming experience using the Python libraries NumPy, Pandas, and Matplotlib.
- Implementing NoSQL databases and writing queries using Cypher.
- Designing end user visualizations using Power BI, QlikView and Tableau.
- Experience working with all versions of SQL Server 2005/2008/2008R2/2012/2014/2016/2017/2019
- Experience using the expression languages MDX and DAX.
- Experience in migrating on-premises SQL Server databases to Microsoft Azure.
- Hands on experience in using Azure blob storage, Azure Data Lake Storage Gen1 and Azure Data Lake Storage Gen2.
- Performance tuning complex SQL queries, hands on experience using SQL Extended events.
- Data modeling using Power BI for ad-hoc reporting.
- Raw data load automation using T-SQL and SSIS.
- Expert in migrating existing on-premises databases to SQL Azure.
- Experience in using U-SQL for Azure Data Lake Analytics.
- Hands on experience in generating SSRS reports using MDX.
- Experience in designing predictive models using Python and SQL Server.
- Developing machine learning models using Azure Databricks and SQL Server
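As a small example of the "data preparation, cleaning and processing" work listed above, here is a NumPy sketch that imputes missing values (NaN) with the column mean before modeling. The array contents are made-up sample data.

```python
import numpy as np

# Two-column sample dataset with missing values.
data = np.array([
    [1.0, 2.0],
    [np.nan, 4.0],
    [3.0, np.nan],
])

col_means = np.nanmean(data, axis=0)   # per-column mean, ignoring NaNs
rows, cols = np.where(np.isnan(data))  # positions of the missing values
data[rows, cols] = col_means[cols]     # impute each NaN with its column mean
print(data)  # column 0 mean is 2.0, column 1 mean is 3.0
```

The same imputation is available as a one-liner in scikit-learn's `SimpleImputer`, but the NumPy version makes the mechanics visible.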
Bachelor’s degree or equivalent experience
● Knowledge of database fundamentals and fluency in advanced SQL, including concepts such as windowing functions
● Knowledge of popular scripting languages for data processing such as Python, as well as familiarity with common frameworks such as Pandas
● Experience building streaming ETL pipelines with tools such as Apache Flink, Apache Beam, Google Cloud Dataflow, DBT, and equivalents
● Experience building batch ETL pipelines with tools such as Apache Airflow, Spark, DBT, or custom scripts
● Experience working with messaging systems such as Apache Kafka (and hosted equivalents such as Amazon MSK) and Apache Pulsar
● Familiarity with BI applications such as Tableau, Looker, or Superset
● Hands-on coding experience in Java or Scala
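The windowing functions mentioned above can be demonstrated with Python's built-in sqlite3 module (window function support requires SQLite 3.25 or newer). This sketch computes a per-account running total; the table and column names are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE txns (account TEXT, ts INTEGER, amount REAL)")
cur.executemany("INSERT INTO txns VALUES (?, ?, ?)", [
    ("A1", 1, 100.0), ("A1", 2, 50.0), ("A2", 1, 75.0),
])

# SUM(...) OVER (PARTITION BY ... ORDER BY ...) gives a running total
# within each account, without collapsing rows the way GROUP BY would.
rows = cur.execute("""
    SELECT account, ts, amount,
           SUM(amount) OVER (PARTITION BY account ORDER BY ts) AS running_total
    FROM txns
    ORDER BY account, ts
""").fetchall()
for row in rows:
    print(row)
```

Unlike `GROUP BY`, the window function keeps every input row and attaches the aggregate alongside it, which is what makes windowing useful for rankings, running totals, and lag/lead comparisons.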
Minimum 2 years of work experience on Snowflake and Azure storage.
Minimum 3 years of development experience with ETL tools.
Strong SQL database skills in other databases such as Oracle, SQL Server, DB2, and Teradata.
Good to have Hadoop and Spark experience.
Good conceptual knowledge of data warehousing and various methodologies.
Working knowledge of scripting, such as UNIX shell.
Good presentation and communication skills.
Should be flexible with overlapping working hours.
Should be able to work independently and be proactive.
Good understanding of the Agile development cycle.