Job Description
Job Responsibilities
- Design and implement robust database solutions, including:
  - Security, backup and recovery
  - Performance, scalability, monitoring and tuning
  - Data management and capacity planning
  - Planning and implementing failover between database instances
- Create data architecture strategies for each subject area of the enterprise data model.
- Communicate plans, status and issues to higher management levels.
- Collaborate with the business, architects and other IT organizations to plan a data strategy, sharing important information related to database concerns and constraints.
- Produce all project data architecture deliverables.
- Create and maintain a corporate repository of all data architecture artifacts.
Skills Required:
- Understanding of data analysis, business principles, and operations
- Software architecture and design; network design and implementation
- Data visualization, data migration and data modelling
- Relational database management systems
- DBMS software, including SQL Server
- Database and cloud computing design, architectures and data lakes
- Information management and data processing on multiple platforms
- Agile methodologies and enterprise resource planning implementation
- Ability to demonstrate database technical functionality, such as performance tuning, backup and recovery, and monitoring
- Excellent skills with advanced features such as database encryption, replication and partitioning
- Strong problem-solving, organizational and communication skills
About Telstra
Telstra is Australia’s leading telecommunications and technology company, with operations in more than 20 countries, including In India where we’re building a new Innovation and Capability Centre (ICC) in Bangalore.
We’re growing fast, and for you that means many exciting opportunities to develop your career at Telstra. Join us on this journey, and together we’ll reimagine the future.
Why Telstra?
- We're an iconic Australian company with a rich heritage built over 100 years. Telstra is Australia's leading telecommunications and technology company, and we've been operating internationally for more than 70 years.
- International presence spanning over 20 countries.
- We are one of the 20 largest telecommunications providers globally
- At Telstra, the work is complex and stimulating, but with that comes a great sense of achievement. We are shaping tomorrow's modes of communication with our innovation-driven teams.
Telstra offers an opportunity to make a difference to the lives of millions of people, with the choice of flexibility in work and a rewarding career that you will be proud of!
About the team
Being part of Networks & IT means you'll be part of a team that focuses on extending our network superiority to enable the continued execution of our digital strategy.
With us, you'll be working with world-leading technology and change the way we do IT to ensure business needs drive priorities, accelerating our digitisation programme.
Focus of the role
Any new engineer joining the data chapter will mostly be developing reusable data processing and storage frameworks that can be used across the data platform.
About you
To be successful in the role, you'll bring skills and experience in:
Essential
- Hands-on experience in Spark Core, Spark SQL, SQL/Hive/Impala, Git/SVN (or any other VCS) and data warehousing
- Skilled in the Hadoop ecosystem (HDP/Cloudera/MapR/EMR, etc.)
- Azure Data Factory/Airflow/Control-M/Luigi
- PL/SQL
- Exposure to NoSQL (HBase/Cassandra/graph databases such as Neo4j/MongoDB)
- File formats (Parquet/ORC/Avro/Delta/Hudi, etc.)
- Kafka/Kinesis/Event Hubs
Highly Desirable
Experience and knowledge of the following:
- Spark Streaming
- Cloud exposure (Azure/AWS/GCP)
- Azure data offerings - ADF, ADLS Gen2, Azure Databricks, Azure Synapse, Event Hubs, Cosmos DB, etc.
- Presto/Athena
- Azure DevOps
- Jenkins/Bamboo or any similar build tools
- Power BI
- Prior experience building, or working in a team that builds, reusable frameworks
- Data modelling.
- Data Architecture and design principles. (Delta/Kappa/Lambda architecture)
- Exposure to CI/CD
- Code Quality - Static and Dynamic code scans
- Agile SDLC
If you've got a passion to innovate, want to succeed as part of a great team, and are looking for the next step in your career, we'd welcome you to apply!
___________________________
We’re committed to building a diverse and inclusive workforce in all its forms. We encourage applicants from diverse gender, cultural and linguistic backgrounds and applicants who may be living with a disability. We also offer flexibility in all our roles, to ensure everyone can participate.
To learn more about how we support our people, including accessibility adjustments we can provide you through the recruitment process, visit tel.st/thrive.
- Designing and coding the data warehousing system to desired company specifications
- Conducting preliminary testing of the warehousing environment before data is extracted
- Extracting company data and transferring it into the new warehousing environment
- Testing the new storage system once all the data has been transferred
- Troubleshooting any issues that may arise
- Providing maintenance support
- Consulting with data management teams to get a big-picture idea of the company’s data storage needs
- Presenting the company with warehousing options based on their storage needs
- Experience of 1-3 years in Informatica Power Center
- Excellent knowledge of Oracle database and PL/SQL, including stored procedures, functions, user-defined functions, table partitioning, indexes, views, etc.
- Knowledge of SQL Server database
- Hands-on experience in Informatica PowerCenter and database performance tuning and optimization, including complex query optimization techniques
- Understanding of ETL control frameworks
- Experience in UNIX shell/Perl scripting
- Good communication skills, including the ability to write clearly
- Able to function effectively as a member of a team
- Proactive with respect to personal and technical development
Mandatory Skills required:
1. Ability to translate business requirements into technical requirements for QlikView
2. Perform detailed analysis of source systems and source-system data, and model that data in QlikView
3. Design, develop, and test QlikView scripts to import data from source systems, data feeds and flat files to create Qlik marts
4. Proficiency with QlikView scripting, complex QlikView functions and advanced QlikView expressions; experience with complex data models and optimizing data models for query performance to create Qlik marts
5. Architecture optimization (including hardware sizing, security setup and performance tuning)
6. Development in QlikView/Qlik Sense
Full SDLC work, including analysis, design, development, enhancement, testing, maintenance and technical support for this project. Your primary focus is to enhance and maintain the OFSAA application.
Result Areas
OFSAA platform: in-depth knowledge of the OFSAA platform and its various components, namely the OFSAA ALM, PFT and FTP modules
Coordinate with client teams, infrastructure teams and vendors on product issues
Monitor the client ticketing tool and maintain service levels
Accept OFSAA incidents from L1
Troubleshoot and collaborate with technical teams on resolution
Perform root cause analysis for problem tickets
OFSAA bug fixes, functional testing and release
Must be experienced in Informatica PowerCenter
Must have exposure to SQL and PL/SQL: complex queries, functions, procedures
Understanding of standard n-tier systems architecture, processes and principles
Configuration and operation of web application servers (WebLogic, WebSphere, Tomcat) is an added advantage
Good communicator, able to ensure a sound understanding of business requirements and to prepare functional requirement documents.
Candidates must have:
Strong analytical and problem-solving skills
Strong knowledge of the OFSAA ALM, PFT and FTP modules, and of Oracle Database using PL/SQL.
Possess strong verbal and written communication skills
Knowledge, Skills and Experience
Overall 5+ years of experience in OFSAA
5+ years of hands-on experience in SQL and PL/SQL.
Cross-border team management experience.
Strong organizational, team building and leadership qualities.
Strong understanding of SDLC process /Agile Methodology
Strong interpersonal communication skills
Experience of banking operations, in particular core banking, general ledger and loans
2) Design and create AAS/OLAP/Tabular cubes and automate processes for analytical needs. Expertise in developing OLAP cubes, developing complex calculations and aggregations, and implementing a dynamic security model using DAX functions in Azure Analysis Services.
3) Writing optimized SQL queries for integration with other applications; maintaining data quality and overseeing database security, partitions and indexes. Extensive use of Performance Monitor, SQL Profiler and DMVs to resolve deadlocks, monitor long-running queries and troubleshoot cubes, SQL and T-SQL.
4) Designing database tables and structures, and creating views, functions and stored procedures.
5) Exposure to Power BI will be an added advantage.
Office Location: Goregaon, Mumbai
Position description: 2+ years of experience in database development.
Primary Responsibilities:
- Understand requirements from front end applications developers
- Write advanced queries, stored procedures, cursors, functions & triggers
- Conduct code reviews
- Work with high-traffic application servers
Required Skills:
Mandatory: Passionate about sports, Problem solving, Team player, Target & Result oriented.
Functional: MSSQL development - T-SQL, stored procedures and triggers, advanced queries, optimization, indexes, joins, database design, JIRA.
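As a rough, hypothetical illustration of the kind of day-to-day work the role above involves (triggers plus an aggregate query), here is a minimal sketch using Python's built-in SQLite rather than MSSQL, so the T-SQL syntax will differ; table and column names are invented for the example:

```python
import sqlite3

# In-memory database standing in for an application's MSSQL instance.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Toy schema: match results, plus a per-team running tally kept by a trigger.
cur.executescript("""
CREATE TABLE matches (id INTEGER PRIMARY KEY, team TEXT, points INTEGER);
CREATE TABLE tally (team TEXT PRIMARY KEY, total INTEGER DEFAULT 0);

-- Trigger: keep the tally in sync whenever a match result is inserted.
CREATE TRIGGER update_tally AFTER INSERT ON matches
BEGIN
    INSERT OR IGNORE INTO tally (team, total) VALUES (NEW.team, 0);
    UPDATE tally SET total = total + NEW.points WHERE team = NEW.team;
END;
""")

cur.executemany("INSERT INTO matches (team, points) VALUES (?, ?)",
                [("A", 3), ("B", 1), ("A", 3)])

# An "advanced query" in miniature: rank teams by accumulated points.
cur.execute("SELECT team, total FROM tally ORDER BY total DESC")
print(cur.fetchall())  # [('A', 6), ('B', 1)]
```

In MSSQL the same idea would be expressed with `CREATE TRIGGER ... AFTER INSERT` over the `inserted` pseudo-table and a `MERGE` or `UPDATE`/`INSERT` pair; the sketch only shows the shape of trigger-driven bookkeeping plus a reporting query.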
Who we look for
- Strong technical expertise and building ability
- Ability to envisage how your technical knowledge could be applied outside of academia - with a focus on impact/disruption
- Ability to explain and articulate complex ideas simply
- Ability to digest difficult questions/information
JD:
Required Skills:
- Intermediate to expert-level hands-on programming in one of the following languages: Java, Python, PySpark or Scala.
- Strong practical knowledge of SQL.
- Hands-on experience with Spark/Spark SQL
- Data structures and algorithms
- Hands-on experience as an individual contributor in Design, Development, Testing and Deployment of Big Data technologies based applications
- Experience in Big Data application tools, such as Hadoop, MapReduce, Spark, etc
- Experience with NoSQL databases such as HBase
- Experience with Linux OS environment (Shell script, AWK, SED)
- Intermediate RDBMS skills; able to write SQL queries with complex relations on top of a large RDBMS (100+ tables)
We are looking for BE/BTech graduates (2018/2019 batch) who want to build their career as a Data Engineer, covering technologies like Hadoop, NoSQL, RDBMS, Spark, Kafka, Hive, ETL, MDM and Data Quality. You should be willing to learn, explore, experiment, and develop POCs/solutions using these technologies with guidance and support from highly experienced industry leaders. You should be passionate about your work and willing to go the extra mile to achieve results.
We are looking for candidates who believe in commitment and in building strong relationships. We need people who are passionate about solving problems through software and are flexible.
Required Experience, Skills and Qualifications
Passionate to learn and explore new technologies
Any RDBMS experience (SQL Server/Oracle/MySQL)
Any ETL tool experience (Informatica/Talend/Kettle/SSIS)
Understanding of Big Data technologies
Good Communication Skills
Excellent Mathematical / Logical / Reasoning Skills