11+ OFSAA Jobs in Hyderabad | OFSAA Job openings in Hyderabad
Apply to 11+ OFSAA Jobs in Hyderabad on CutShort.io. Explore the latest OFSAA Job opportunities across top companies like Google, Amazon & Adobe.
- 15+ years of experience in OFSAA Financial Services Data Foundation and OFSAA Regulatory Reporting solutions
- Expert in enterprise solution architecture and design
- Strong understanding of the OFSAA data model, dimension management, and enterprise data warehousing
- Strong understanding of reconciling OFSAA instrument balances with General Ledger summary-level balances
- Experience in defining and building the OFSAA data architecture and sourcing strategy to ensure data accuracy, integrity, and quality
- Understanding of banking treasury products, US Fed regulatory reporting, etc.
- Strong understanding of building data lineage
- Strong knowledge of OFSAA data management tools (F2T/T2T/PLT/SCDs); a minimal SCD sketch follows this list
- Experience configuring business rules in the OFSAA framework
- Strong experience in deploying the OFSAA platform (OFSAAI – OFSAA Infrastructure) and installing OFSAA applications, preferably OFSAA 8.x onwards
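For context on the SCD requirement above, here is a minimal, illustrative sketch of Type 2 slowly changing dimension handling in Python with pandas. The column names (dim_key, attribute, effective_from, effective_to, is_current) are assumptions for illustration only; this is not OFSAA's actual F2T/T2T implementation.

```python
import pandas as pd

def apply_scd_type2(dim: pd.DataFrame, incoming: pd.DataFrame, as_of: str) -> pd.DataFrame:
    """Hypothetical Type 2 SCD: expire changed rows and append new current versions."""
    current = dim[dim["is_current"]]  # assumes a boolean is_current flag
    merged = current.merge(incoming, on="dim_key", suffixes=("_old", "_new"))
    changed = merged.loc[merged["attribute_old"] != merged["attribute_new"], "dim_key"]

    # Close out the current version of every changed record
    expire_mask = dim["dim_key"].isin(changed) & dim["is_current"]
    dim.loc[expire_mask, ["effective_to", "is_current"]] = [as_of, False]

    # Append the new attribute values as open-ended current rows
    new_rows = incoming[incoming["dim_key"].isin(changed)].copy()
    new_rows["effective_from"] = as_of
    new_rows["effective_to"] = None
    new_rows["is_current"] = True
    return pd.concat([dim, new_rows], ignore_index=True)
```

Brand-new keys (present in the incoming feed but not yet in the dimension) are omitted to keep the sketch short.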
* Candidates must have real estate knowledge.
* In-house drafting of all documents: MOU, Agreement to Sell, Sale Deeds; handling payment collections and the bank loan process.
* Should have a minimum of 2 to 4 years of experience in CRM and managing teams.
* Should have handled a real estate CRM.
Preferred candidate profile
* Candidates must have real estate knowledge with a minimum of 2 to 4 years of experience.
* In-house drafting of all documents: MOU, Agreement to Sell, Sale Deeds; handling payment collections and the bank loan process.
* Collections, preparation of documentation, and bank documents.
Publicis Sapient Overview:
As Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and you will independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and you will independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python, experience in data ingestion, integration, wrangling, computation, and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.
Job Title: Senior Associate L1 – Data Engineering
Role & Responsibilities:
Your role is focused on the design, development, and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies
2. Minimum 1.5 years of experience in Big Data technologies
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build end-to-end data pipelines (see the streaming sketch after this list). Working knowledge of real-time data pipelines is an added advantage.
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred
5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
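As a point of reference for the pipeline components listed above, the sketch below shows a minimal real-time ingestion step using Spark Structured Streaming to read JSON events from Kafka in Python. The broker address, topic name, schema, and output paths are placeholders, not part of the job description.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Requires the spark-sql-kafka-0-10 connector package on the classpath
spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

# Hypothetical schema for the incoming JSON events
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", StringType()),
])

# Read the Kafka topic as a streaming DataFrame (broker and topic are placeholders)
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")
       .option("subscribe", "events")
       .load())

# Parse the JSON payload and persist it as Parquet with checkpointing
parsed = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")
query = (parsed.writeStream
         .format("parquet")
         .option("path", "/tmp/events")
         .option("checkpointLocation", "/tmp/events_chk")
         .start())
query.awaitTermination()
```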
Preferred Experience and Knowledge (Good to Have):
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience
2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra and Alation
3. Knowledge of distributed messaging frameworks such as ActiveMQ, RabbitMQ, or Solace, search and indexing, and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD: infrastructure provisioning on cloud, automated build and deployment pipelines, code quality
6. Working knowledge of data platform related services on at least one cloud platform, IAM, and data security
7. Cloud data specialty and other related Big Data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
System Engineer (ECU)
Location: Hyderabad
Full-time
Your tasks as Specialist System Engineer ECU (m/f):
- Analysis of the requirements for the electronic control of occupant safety systems, in alignment with customer specifications
- Definition of the System Requirements for the embedded system
- Ensuring quality requirements according to ASPICE and ASIL (ISO 26262)
- Definition of the design and architecture of the ECU system using SysML, taking into account the latest technologies and continuous improvement ideas
- Coordination and functional leadership of the project team from hardware and software development
Your profile as Specialist Systems Engineer ECU (m/f):
- Successfully completed engineering studies in electronics, embedded system, or software
- Several years of professional experience in the development of control units within the automotive industry
- Experience in product development according to ASPICE and ISO 26262
- Assertiveness, target-oriented and structured working methods, ability to work in a team
- Willingness to work in an intercultural environment
--
Regards,
Qcentrio
Key Skills
- Must have experience building apps using Flutter and good knowledge of Dart. Preferably has built and deployed a couple of apps on both iOS and Android.
- Good knowledge of integrating or developing app functionality using Firebase
- Must have ported at least 2 applications to the iOS and Android platforms. Exposure to the entire build pipeline is a must.
- Has good knowledge of how native Android and iOS apps work. Must have experience building native iOS apps.
- Has an understanding of computer vision and basic deep learning concepts. Any prototypes or proofs of concept are a big plus.
What do they need to do?
- Code - write or learn to write clean code, understand design patterns, and develop a quick turnaround time to ship updates
- UI/UX - understand design guidelines and implement them in the project
- Communication - communicate project goals clearly to the team. Understand and convert project requirements into actionable steps
- Documentation - maintain documentation to support product development
What is HomeGround?
- HomeGround is an AI platform that helps aspiring cricketers and coaches level up their training. We help our users get an equal opportunity to play, train for, and achieve their goals by providing professional-level training analytics right on the smartphone, with no special equipment or sensors required.
- HomeGround is one of the Top 10 early-stage sports tech startups selected by Startupbootcamp Australia.
- Our team shares experience in product growth, marketing and deep learning. We are passionate about our work and are committed 100% to our startup.
We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of different on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience: 4-10 years
Location: Pune
Mandatory Skills -
- Strong in ETL/SQL development
- Strong Data Warehousing skills
- Hands-on experience working with Unix/Linux
- Development experience in Enterprise Data warehouse projects
- Good to have: experience working with Python and shell scripting
Opportunities -
- Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
- Would get a chance to be part of enterprise-grade implementations of Cloud and Big Data systems
- Will play an active role in setting up the Modern data platform based on Cloud and Big Data
- Would be part of teams with rich experience in various aspects of distributed systems and computing
About Us!
A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud, leveraging automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum systems, along with ETL tools like Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with additional capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud (a minimal illustrative check follows).
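By way of illustration only (and not Pelican's actual implementation), migration validation of this kind typically compares row counts and simple column aggregates between the source warehouse and the migrated target. The sketch below assumes two generic Python DB-API connections and hypothetical table and column names.

```python
def validate_table(src_conn, tgt_conn, table: str, numeric_col: str) -> dict:
    """Compare row count and a column sum between source and target copies of a table."""
    totals = {}
    for name, conn in (("source", src_conn), ("target", tgt_conn)):
        cur = conn.cursor()
        # Table and column names are assumed to come from trusted migration config
        cur.execute(f"SELECT COUNT(*), SUM({numeric_col}) FROM {table}")
        totals[name] = cur.fetchone()
    return {
        "table": table,
        "row_count_match": totals["source"][0] == totals["target"][0],
        "sum_match": totals["source"][1] == totals["target"][1],
    }

# Hypothetical usage:
# result = validate_table(teradata_conn, bigquery_conn, "sales_fact", "net_amount")
```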
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.
Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
We have a requirement for a Collibra Developer.
Experience required: 5-12 years
Experience in Data Governance and Data Quality management