10+ Data Modeling Jobs in Hyderabad | Data Modeling Job Openings in Hyderabad
Apply to 10+ data modeling jobs in Hyderabad on CutShort.io. Explore the latest data modeling job opportunities across top companies like Google, Amazon & Adobe.
The Sr. Analytics Engineer provides technical expertise in needs identification, data modeling, data movement, and source-to-target transformation mapping, along with automation and testing strategies, translating business needs into technical solutions while adhering to established data guidelines and approaches from a business-unit or project perspective.
Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL) and approaches to address business and environmental challenges (a minimal star schema sketch follows below).
Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.
Actively participates with other consultants in problem-solving and approach development.
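For orientation, here is a minimal sketch of the kind of traditional star schema structure mentioned above, built in an in-memory SQLite database; the tables and columns are hypothetical illustrations, not part of any posting's requirements.

```python
import sqlite3

# Minimal, hypothetical star schema: one fact table referencing two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
    full_date    TEXT,
    month        INTEGER,
    year         INTEGER
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
CREATE TABLE fact_sales (
    date_key     INTEGER REFERENCES dim_date(date_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    units_sold   INTEGER,
    revenue      REAL
);
""")

# A typical analytical query: join the fact table to its dimensions and aggregate.
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.category
""").fetchall()
```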
Responsibilities:
Take a consultative approach with business users, asking questions to understand the business need, and derive the data flow and the conceptual, logical, and physical data models from those needs.
Perform data analysis to validate data models and to confirm the ability to meet business needs.
Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.
Coordinate and consult with the Data Architect, project manager, client business and technical staff, and project developers on data architecture best practices and anything else data-related at the project or business-unit level.
Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.
Coordinate with Data Architects, Program Managers and participate in recurring meetings.
Help and mentor team members to understand the data model and subject areas.
Ensure that the team adheres to best practices and guidelines.
Requirements:
- At least 3 years of strong working knowledge of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.
- Experience with Spark optimization, tuning, and resource allocation.
- Excellent understanding of in-memory distributed computing frameworks like Spark, including parameter tuning and writing optimized workflow sequences (see the sketch after this list).
- Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL or cloud warehouse databases (e.g., Redshift, BigQuery, Cassandra).
- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob Storage, AWS S3, Google Cloud Storage, etc.
- A deep understanding of the various stacks and components of the Big Data ecosystem.
- Hands-on experience with Python is a strong plus.
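As a rough illustration of the Spark tuning and resource-allocation experience called for above, a minimal PySpark sketch; the session settings, paths, and column names are hypothetical and would be tuned per workload and cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical session with explicit resource allocation and shuffle tuning;
# real values depend on cluster size and data volume.
spark = (
    SparkSession.builder
    .appName("etl-sketch")
    .config("spark.executor.memory", "8g")
    .config("spark.executor.cores", "4")
    .config("spark.sql.shuffle.partitions", "200")
    .getOrCreate()
)

# Read, transform, and write a partitioned dataset (illustrative paths).
orders = spark.read.parquet("/data/raw/orders")
daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("daily_amount"))
)
daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_orders")
```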
Responsibilities:
- Collaborate with the development team to understand data requirements and identify potential scalability issues.
- Design, develop, and implement scalable data pipelines and ETL processes to ingest, process, and analyze large volumes of data from various sources.
- Optimize data models and database schemas to improve query performance and reduce latency.
- Monitor and troubleshoot the performance of our Cassandra database on Azure Cosmos DB, identifying bottlenecks and implementing optimizations as needed (see the sketch after this list).
- Work with cross-functional teams to ensure data quality, integrity, and security.
- Stay up to date with emerging technologies and best practices in data engineering and distributed systems.
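For illustration, a minimal sketch of querying a Cassandra keyspace hosted on Azure Cosmos DB from Python; the endpoint, credentials, keyspace, and table are hypothetical placeholders, and the Cosmos Cassandra API conventions (TLS on port 10350) are assumed. Note that queries should target the partition key to avoid expensive cross-partition scans.

```python
import ssl
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

# Hypothetical connection to a Cosmos DB Cassandra API endpoint (TLS, port 10350);
# host, credentials, and keyspace are placeholders.
ssl_context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ssl_context.check_hostname = False
ssl_context.verify_mode = ssl.CERT_NONE  # use proper certificate validation in production

auth = PlainTextAuthProvider(username="account-name", password="account-key")
cluster = Cluster(["account-name.cassandra.cosmos.azure.com"],
                  port=10350, auth_provider=auth, ssl_context=ssl_context)
session = cluster.connect("telemetry")

# Hitting the partition key (device_id here) keeps the query to one partition.
rows = session.execute(
    "SELECT event_time, payload FROM events WHERE device_id = %s LIMIT 100",
    ("device-42",),
)
for row in rows:
    print(row.event_time)
```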
Qualifications & Requirements:
- Proven experience as a Data Engineer or similar role, with a focus on designing and optimizing large-scale data systems.
- Strong proficiency in working with NoSQL databases, particularly Cassandra.
- Experience with cloud-based data platforms, preferably Azure Cosmos DB.
- Solid understanding of distributed systems, data modeling, data warehouse design, and ETL processes.
- A detailed understanding of the Software Development Life Cycle (SDLC) is required.
- Good to have: knowledge of a visualization tool such as Power BI or Tableau.
- Good to have: knowledge of the SAP landscape (SAP ECC, SLT, BW, HANA, etc.).
- Good to have: experience on a data migration project.
- Knowledge of Supply Chain domain would be a plus.
- Familiarity with software architecture (data structures, data schemas, etc.)
- Familiarity with Python programming language is a plus.
- The ability to work in a dynamic, fast-paced work environment.
- A passion for data and information, with strong analytical, problem-solving, and organizational skills.
- Self-motivated with the ability to work under minimal direction.
- Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
Position Description:
TTEC Digital is looking for enthusiastic Developers for Genesys Contact Center products and custom-developed cloud solutions. As a Developer, you will function as an active member of the development team through the Design, Build, Deploy, and Accept phases of a project's lifecycle, building web and Windows services, APIs, and applications that integrate with our customers' back-end CRM systems, databases, and external third-party APIs.
Responsibilities:
- Works with customers as needed to translate design requirements into application solutions, ensuring the requirements are met according to the team’s and practice area’s standards and best practices.
- Communicates with project manager/client to identify application requirements.
- Ensures applications meet the standards and requirements of both the client and project manager.
- Conducts tests of the application for functionality, reliability, and stability.
- Deploys/implements the application to the client.
- Maintains and supports existing applications by fixing problems, addressing issues and determining the need for enhancements.
- Demonstrates concern for meeting client needs in a manner that provides satisfaction and excellent results for the client, leading to additional opportunities within the client account.
- Performs all tasks within the budget and on time while meeting all necessary project requirements. Communicates regularly if budget and/or scope changes.
- Demonstrates professionalism and leadership in representing the Company to customers and vendors.
- Core PureConnect handler development & maintenance.
- Monitor and respond to system errors. Participate in on-call rotation.
- Follow up on and resolve outstanding issues in a timely manner.
- Update customer documentation to reflect changes in system configuration as needed.
- Understand system hardware/software to be able to identify problems and provide a remedy.
- Handle TAC/Engineering escalations as directed by the team lead or team manager.
Requirements
- Bachelor’s degree in computer science, business, or related area.
- 3+ years of relevant experience and proven ability as a software developer.
- Experience with the Microsoft development platform.
- Experience with .NET Framework.
- Professional experience with integration services including XML, SOAP, REST, TCP/IP, JavaScript, and HTML (the REST pattern is illustrated in the sketch after this list).
- Deep understanding of application architecture.
- Familiarity with data modeling and architecture.
- Deep expertise and familiarity with the Genesys PureCloud development platform.
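Although this role sits on the Microsoft stack, the REST/CRM integration pattern it calls for looks roughly like the following sketch (shown in Python for brevity; the endpoint, token, and fields are hypothetical placeholders, not a real CRM API):

```python
import requests

# Hypothetical REST calls against a back-end CRM: fetch a contact record,
# then post an interaction log. Endpoint, auth scheme, and fields are placeholders.
BASE_URL = "https://crm.example.com/api/v1"
HEADERS = {"Authorization": "Bearer <token>", "Accept": "application/json"}

resp = requests.get(f"{BASE_URL}/contacts/12345", headers=HEADERS, timeout=10)
resp.raise_for_status()
contact = resp.json()

payload = {"contact_id": contact["id"], "channel": "voice", "disposition": "resolved"}
resp = requests.post(f"{BASE_URL}/interactions", json=payload, headers=HEADERS, timeout=10)
resp.raise_for_status()
```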
We offer an outstanding career development opportunity, a competitive salary, and comprehensive benefits. We are looking for individuals with a team-player attitude, a strong drive for career growth, and a passion for excellence in client support, delivery, and satisfaction.
Location: Remote during COVID (Hyderabad StackNexus office post-COVID)
Experience: 5-7 years
Skills required: Hands-on experience in Azure data modeling, Python, SQL, and Azure Databricks.
Notice period: Immediate to 15 days
o Strong Python development skills, with 7+ years of experience with SQL.
o A bachelor's or master's degree in Computer Science or a related area.
o 5+ years of experience in data integration and pipeline development.
o Experience implementing Databricks Delta Lake and data lakes (see the sketch after this list).
o Expertise designing and implementing data pipelines using modern data engineering approaches and tools: SQL, Python, Delta Lake, Databricks, Snowflake, Spark.
o Experience working with multiple file formats (Parquet, Avro, Delta Lake) and APIs.
o Experience with AWS cloud data integration with S3.
o Hands-on development experience with Python and/or Scala.
o Experience with SQL and NoSQL databases.
o Experience using data modeling techniques and tools (focused on dimensional design).
o Experience with microservice architecture using Docker and Kubernetes.
o Experience working with one or more of the public cloud providers, i.e., AWS, Azure, or GCP.
o Experience effectively presenting and summarizing complex data to diverse audiences through visualizations and other means.
o Excellent verbal and written communication skills and strong leadership capabilities.
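As a rough sketch of the Delta Lake pipeline work described above, assuming a Databricks or delta-spark-enabled session; the S3 paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("delta-sketch").getOrCreate()

# Ingest raw Parquet from S3 and land it as a Delta table (illustrative paths).
raw = spark.read.parquet("s3a://example-bucket/raw/events/")
clean = raw.dropDuplicates(["event_id"]).withColumn("ingested_at", F.current_timestamp())
clean.write.format("delta").mode("append").save("s3a://example-bucket/delta/events/")

# Delta tables can then be read back, including time travel by version.
latest = spark.read.format("delta").load("s3a://example-bucket/delta/events/")
v0 = spark.read.format("delta").option("versionAsOf", 0).load("s3a://example-bucket/delta/events/")
```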
Skills:
ML
Modelling
Python
SQL
Azure Data Lake, Data Factory, Databricks, Delta Lake
We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud, with knowledge of on-premises and cloud data implementations in the field of Big Data and analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description:
Experience: 6+ years
Work Location: Pune / Hyderabad
Technical Skills:
- Good programming experience as an Oracle PL/SQL, MySQL, and SQL Server developer
- Knowledge of database performance tuning techniques
- Rich experience in database development
- Experience designing and implementing business applications using the Oracle relational database management system
- Experience developing complex database objects like stored procedures, functions, packages, and triggers using SQL and PL/SQL (see the sketch after this list)
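For illustration, a minimal sketch of exercising a hypothetical PL/SQL stored procedure from Python using the oracledb driver; the connection details and the archive_old_orders procedure are placeholders, not part of the posting.

```python
import oracledb

# Hypothetical connection; DSN and credentials are placeholders.
conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb1")
cur = conn.cursor()

# Call a hypothetical stored procedure that archives orders older than n days,
# returning the number of rows moved through an OUT parameter.
rows_moved = cur.var(int)
cur.callproc("archive_old_orders", [90, rows_moved])
print(rows_moved.getvalue())

conn.commit()
cur.close()
conn.close()
```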
Required Candidate Profile:
- Excellent communication, interpersonal, and analytical skills, and a strong ability to drive teams
- Analyzes data requirements and the data dictionary for moderate to complex projects
- Leads data-model-related analysis discussions while collaborating with Application Development teams, Business Analysts, and Data Analysts during joint requirements analysis sessions
- Translate business requirements into technical specifications with an emphasis on highly available and scalable global solutions
- Stakeholder management and client engagement skills
- Strong communication skills (written and verbal)
About Us!
A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by leveraging automation to migrate their data, workloads, ETL, and analytics to the cloud.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum platforms, along with ETL tools like Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers: Google, Microsoft, Amazon, and Snowflake.
We have our own products!
- Eagle: Data Warehouse Assessment & Migration Planning
- Raven: Automated Workload Conversion
- Pelican: Automated Data Validation, which helps automate and accelerate data migration to the cloud
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.
Benefits we Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
As a Lead Solutions Architect at Aganitha, you will:
* Engage and co-innovate with customers in BioPharma R&D
* Design and oversee implementation of solutions for BioPharma R&D
* Manage Engineering teams using Agile methodologies
* Enhance reuse with platforms, frameworks, and libraries
Applying candidates must have demonstrated expertise in the following areas:
1. App dev with modern tech stacks of Python, ReactJS, and fit-for-purpose database technologies
2. Big data engineering with distributed computing frameworks
3. Data modeling in scientific domains, preferably in one or more of: Genomics, Proteomics, Antibody engineering, Biological/Chemical synthesis and formulation, Clinical trials management
4. Cloud and DevOps automation
5. Machine learning and AI (Deep learning)
Qentelli is seeking a Solution Architect to untangle and redesign a huge, decades-old monolithic legacy system. The interesting part is that the new system should be commissioned module by module while the legacy system phases off accordingly, so your design will have a cutting-edge future state and a transition state to get there (a sketch of this module-by-module routing follows below). The implementation today is all Microsoft tech stack and will continue on a newer Microsoft tech stack. There is also a critical API management component to be introduced into the solution. Performance and scalability will be at the center of your solution architecture, and data modelling is of super-high importance.
You'll have a distributed team, onshore in the US and offshore in India. As a Solution Architect, you should be able to wear multiple hats: working with the client on solutioning and getting it implemented by engineering and infrastructure teams both onshore and offshore. The right candidate will be awesome at fleshing out and documenting every finer detail of the solution, elaborate in communicating with their teams, disciplined at getting it implemented, and passionate about client success.
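The module-by-module commissioning described above is essentially a strangler-fig migration. Purely as an illustration of the routing idea (the real implementation would sit on the Microsoft stack; the framework, hosts, and route prefixes below are hypothetical):

```python
import httpx
from fastapi import FastAPI, Request, Response

app = FastAPI()

# Modules already rebuilt are routed to the new system; everything else
# still falls through to the legacy monolith. Hosts are placeholders.
NEW_SYSTEM = "http://new-system.internal"
LEGACY = "http://legacy-monolith.internal"
MIGRATED_PREFIXES = ("/billing", "/inventory")

@app.api_route("/{path:path}", methods=["GET", "POST", "PUT", "DELETE"])
async def route(path: str, request: Request) -> Response:
    target = NEW_SYSTEM if request.url.path.startswith(MIGRATED_PREFIXES) else LEGACY
    async with httpx.AsyncClient() as client:
        upstream = await client.request(
            request.method, f"{target}{request.url.path}",
            content=await request.body(), headers=dict(request.headers),
        )
    return Response(content=upstream.content, status_code=upstream.status_code)
```

As each module is rebuilt, its prefix moves into the migrated set and the corresponding legacy code can be retired.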
TECHNOLOGIES YOU’LL NEED TO KNOW
Greetings from Qentelli Solutions Private Limited!
We are hiring for PostgreSQL Developer
Experience: 4 to 12 years
Job Location: Hyderabad
Job Description:
- Experience in RDBMS (PostgreSQL preferred), database backend development, data modelling, performance tuning, and exposure to NoSQL DBs, Kubernetes, or cloud (AWS/Azure/GCP)
Skillset for Developer-II:
- Experience with any Big Data tools (NiFi, Kafka, Spark, Sqoop, Storm, Snowflake), database backend development, Python, NoSQL DBs, API exposure, and cloud or Kubernetes exposure
Skillset for API Developer:
- API development with extensive knowledge of any RDBMS (PostgreSQL preferred) and exposure to cloud or Kubernetes (a small performance-tuning sketch follows this section)
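As a small illustration of the database backend and performance-tuning work above, a sketch using psycopg2 to inspect a query plan and add an index; the DSN, table, and index names are hypothetical.

```python
import psycopg2

# Hypothetical DSN; adjust for your environment.
conn = psycopg2.connect("dbname=app user=app host=localhost")
cur = conn.cursor()

# Inspect the plan for a hot query before tuning (EXPLAIN ANALYZE runs the query).
cur.execute("EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = %s", (42,))
for (line,) in cur.fetchall():
    print(line)

# If the plan shows a sequential scan, a targeted index usually fixes it.
cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_customer ON orders (customer_id)")
conn.commit()
cur.close()
conn.close()
```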
Skills Required
- Must demonstrate exceptional verbal and written communication skills
- Must demonstrate ability to communicate effectively at all levels of the organization
- Proven ability to design and implement new processes and facilitate user adoption.
- Strong understanding of the platform, with the ability to build custom apps and objects, formula fields, workflows, custom views, and other content of intermediate complexity (see the sketch after this list)
- Strong understanding of Salesforce.com best practices and functionality
- Strong data management abilities
- A documented history of successfully driving projects to completion
- A demonstrated ability to understand and articulate complex requirements
- Experience with nonprofit processes preferred
- Previous experience working in a SCRUM or agile environment preferred
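For illustration, a minimal sketch of programmatic Salesforce work of the kind implied above, using the simple-salesforce library; the credentials, custom object, and fields are hypothetical placeholders.

```python
from simple_salesforce import Salesforce

# Hypothetical credentials; real orgs typically authenticate via OAuth or a security token.
sf = Salesforce(username="user@example.org", password="secret", security_token="token")

# Query a hypothetical custom object with SOQL, then update the matching records.
result = sf.query(
    "SELECT Id, Name, Status__c FROM Donation__c WHERE Status__c = 'Pending' LIMIT 10"
)
for rec in result["records"]:
    sf.Donation__c.update(rec["Id"], {"Status__c": "Reviewed"})
```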
Organizational Alignment
- Reports to the Senior Manager of Business Systems
- Supports full Sales and Sales Operations staff