Technogen India Pvt Ltd
https://technogeninc.com/
Jobs at Technogen India Pvt Ltd
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Brief write-up on how you could be a great fit for the role:
What your days and weeks will include:
- Agile scrum meetings, as appropriate, to track and drive individual and team accountability.
- Ad hoc collaboration sessions to define and adopt standards and best practices.
- A weekly ½ hour team huddle to discuss initiatives across the entire team.
- A weekly ½ hour team training session focused on comprehension of our team delivery model
- Conduct technical troubleshooting, maintenance, and operational support for production code
- Code! Test! Code! Test!
The skills that make you a great fit.
- 5 to 10 years of experience with Eagle Investment Accounting (a BNY Mellon product) as a domain expert/QA tester.
- Expert-level knowledge of capital markets or the investment side of financial services.
- Must be self-aware, resilient, possess strong communication skills, and be able to lead.
- Strong experience with release testing in an agile model.
- Strong experience preparing test plans, test cases, and test summary reports.
- Strong experience in execution of smoke and regression test cases
- Strong experience utilizing and managing large and complex data sets.
- Experience with SQL Server and Oracle desired (see the illustrative data check after this list).
- Knowledge of the DevOps delivery model, including roles and technologies, is a plus.
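As a rough illustration of the smoke/regression and SQL skills above, a tester might script a simple data check like the sketch below. This is only an assumed example: the pyodbc driver string, database, and table names are placeholders, not details from the posting.

    import pyodbc

    # Placeholder connection details for a QA SQL Server instance.
    CONN_STR = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=qa-db.example.com;DATABASE=eagle_qa;Trusted_Connection=yes;"
    )

    def compare_row_counts(baseline_table, release_table):
        """Regression check: row counts should match before and after a release."""
        conn = pyodbc.connect(CONN_STR)
        try:
            cursor = conn.cursor()
            counts = {}
            for table in (baseline_table, release_table):
                cursor.execute(f"SELECT COUNT(*) FROM {table}")  # table names are trusted inputs here
                counts[table] = cursor.fetchone()[0]
        finally:
            conn.close()
        result = "PASS" if counts[baseline_table] == counts[release_table] else "FAIL"
        print(counts, "->", result)
        return result == "PASS"

    # Hypothetical table names, for illustration only.
    compare_row_counts("ACCT_POSITIONS_BASELINE", "ACCT_POSITIONS_RELEASE")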
- Analyze client requirements and design customized Saviynt IGA solutions to meet their identity management needs
- Configure and implement Saviynt IGA platform features, including role-based access control, user provisioning, de-provisioning, certification campaigns, and entitlement management
- Collaborate with clients and internal stakeholders to understand business processes and translate them into effective IGA policies
- Assist in the integration of Saviynt IGA with existing identity and access management (IAM) systems, directories, and applications
- Conduct testing and troubleshooting to ensure the accuracy and effectiveness of the implemented IGA solution (a generic example of such a check follows this list)
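As a generic, hedged illustration of the validation work above (this does not use Saviynt's own API; the file names and columns are invented for the example), a tester might cross-check an HR termination feed against application accounts to flag missed de-provisioning:

    import pandas as pd

    # Placeholder extracts; in practice these come from the HR system and the target application.
    hr = pd.read_csv("hr_feed.csv")             # columns: employee_id, status
    accounts = pd.read_csv("app_accounts.csv")  # columns: employee_id, account_id, enabled (boolean)

    # Enabled accounts belonging to terminated employees should have been de-provisioned.
    terminated = hr.loc[hr["status"] == "terminated", "employee_id"]
    orphaned = accounts[accounts["employee_id"].isin(terminated) & accounts["enabled"]]

    print(f"{len(orphaned)} account(s) flagged for de-provisioning review")
    print(orphaned[["employee_id", "account_id"]])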
Daily and monthly responsibilities
- Review and coordinate with business application teams on data delivery requirements.
- Develop estimation and proposed delivery schedules in coordination with development team.
- Develop sourcing and data delivery designs.
- Review data model, metadata and delivery criteria for solution.
- Review and coordinate with team on test criteria and performance of testing.
- Contribute to the design, development and completion of project deliverables.
- Complete in-depth data analysis and contribute to strategic efforts.
- Develop a complete understanding of how we manage data, with a focus on improving how data is sourced and managed across multiple business areas.
Basic Qualifications
- Bachelor’s degree.
- 5+ years of data analysis working with business data initiatives.
- Knowledge of Structured Query Language (SQL) and use in data access and analysis.
- Proficiency in data management, including data analytics.
- Excellent verbal and written communication skills and high attention to detail.
- Experience with Python (a short analysis sketch follows this list).
- Presentation skills in demonstrating system design and data analysis solutions.
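To ground the SQL and Python items above, here is a minimal, self-contained sketch of pulling data with SQL and profiling it with pandas. The table and columns are invented for the example, and SQLite stands in for the actual database:

    import sqlite3
    import pandas as pd

    # Stand-in database; in practice this would be a connection to the real data source.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
        INSERT INTO orders VALUES (1, 'East', 120.0), (2, 'West', 80.5), (3, 'East', 99.9);
    """)

    # Pull the data with SQL, then profile it with pandas.
    df = pd.read_sql("SELECT region, amount FROM orders", conn)
    print(df.groupby("region")["amount"].agg(["count", "mean", "sum"]))
    print("Nulls per column:\n", df.isna().sum())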
.Net Core with SQL is preferred, or a very strong .Net developer with SQL.
The role involves a mix of activities: documentation, SQL support, .Net development, integration, and interaction with end users.
We are looking for a Full Stack Developer (remote position).
• Strong knowledge of HTML, CSS, JavaScript, and JavaScript frameworks (e.g., React, Angular, Vue)
• Experience with backend technologies such as Node.js, DynamoDB, and PostgreSQL.
• Experience with version control systems (e.g., Git)
• Knowledge of web standards and accessibility guidelines
• Familiarity with server-side rendering and SEO best practices
• Experience with AWS services, including AWS Lambda, API Gateway, and DynamoDB (a small illustration follows this list).
• Experience with DevOps: Infrastructure as Code, CI/CD, and test and deployment automation.
• Experience writing and maintaining a test suite throughout a project's lifecycle.
• Familiarity with Web Accessibility standards and technology
• Experience architecting and building GraphQL APIs and RESTful services.
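As a small illustration of the Lambda, API Gateway, and DynamoDB combination above, the sketch below uses Python for brevity (the role itself leans toward Node.js); the table name and route are hypothetical and not taken from the posting:

    import json
    import boto3

    # Hypothetical DynamoDB table, assumed to be provisioned elsewhere (e.g., via IaC).
    TABLE = boto3.resource("dynamodb").Table("users")

    def handler(event, context):
        """Lambda handler behind an API Gateway route such as GET /users/{userId}."""
        user_id = event["pathParameters"]["userId"]
        item = TABLE.get_item(Key={"userId": user_id}).get("Item")
        if item is None:
            return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
        return {"statusCode": 200, "body": json.dumps(item, default=str)}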
- Data Steward:
The Data Steward will collaborate closely with the group's software engineering and business divisions. The Data Steward has overall accountability for the group's/division's data and reporting posture by responsibly managing data assets, data lineage, and data access, and by supporting sound data analysis. The role requires focus on data strategy, execution, and support for projects, programs, application enhancements, and production data fixes. The Data Steward makes well-thought-out decisions on complex or ambiguous data issues, establishes the data stewardship and information management strategy and direction for the group, and communicates effectively with individuals at various levels of the technical and business communities. This individual will become part of the corporate Data Quality and Data Management/entity resolution team supporting various systems across the board.
Primary Responsibilities:
- Responsible for data quality and data accuracy across all group/division delivery initiatives.
- Responsible for data analysis, data profiling, data modeling, and data mapping capabilities.
- Responsible for reviewing and governing data queries and DML.
- Accountable for the assessment, delivery, quality, accuracy, and tracking of any production data fixes.
- Accountable for the performance, quality, and alignment to requirements for all data query design and development.
- Responsible for defining standards and best practices for data analysis, modeling, and queries.
- Responsible for understanding end-to-end data flows and identifying data dependencies in support of delivery, release, and change management.
- Responsible for the development and maintenance of an enterprise data dictionary that is aligned to data assets and the business glossary for the group.
- Responsible for the definition and maintenance of the group's data landscape, including overlays with the technology landscape, end-to-end data flows/transformations, and data lineage.
- Responsible for rationalizing the group's reporting posture through the definition and maintenance of a reporting strategy and roadmap.
- Partners with the data governance team to ensure data solutions adhere to the organization’s data principles and guidelines.
- Owns group's data assets including reports, data warehouse, etc.
- Understand customer business use cases and be able to translate them to technical specifications and vision on how to implement a solution.
- Accountable for defining the performance tuning needs for all group data assets and managing the implementation of those requirements within the context of group initiatives as well as steady-state production.
- Partners with others in test data management and masking strategies and the creation of a reusable test data repository.
- Responsible for solving data-related issues and communicating resolutions with other solution domains.
- Actively and consistently support all efforts to simplify and enhance the Clinical Trial Prediction use cases.
- Apply knowledge in analytic and statistical algorithms to help customers explore methods to improve their business.
- Contribute toward analytical research projects through all stages including concept formulation, determination of appropriate statistical methodology, data manipulation, research evaluation, and final research report.
- Visualize and report data findings creatively in a variety of visual formats that appropriately provide insight to the stakeholders.
- Achieve defined project goals within customer deadlines; proactively communicate status and escalate issues as needed.
Additional Responsibilities:
- Strong understanding of the Software Development Life Cycle (SDLC) with Agile Methodologies
- Knowledge and understanding of industry-standard/best practices requirements gathering methodologies.
- Knowledge and understanding of Information Technology systems and software development.
- Experience with data modeling and test data management tools.
- Experience in data integration projects.
- Good problem-solving and decision-making skills.
- Good communication skills within the team, site, and with the customer
Knowledge, Skills and Abilities
- Technical expertise in data architecture principles and design aspects of various DBMS and reporting concepts.
- Solid understanding of key DBMS platforms like SQL Server, Azure SQL
- Results-oriented, diligent, and works with a sense of urgency. Assertive, responsible for his/her own work (self-directed), have a strong affinity for defining work in deliverables, and be willing to commit to deadlines.
- Experience in MDM tools like MS DQ, SAS DM Studio, Tamr, Profisee, Reltio etc.
- Experience in Report and Dashboard development
- Statistical and Machine Learning models
- Python (sklearn, numpy, pandas, gensim); a brief sketch follows this list.
- Nice to Have:
- 1 year of ETL experience
- Natural Language Processing
- Neural networks and Deep learning
- Experience with the keras, tensorflow, spacy, nltk, and LightGBM Python libraries
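As a hedged illustration of the Python and scikit-learn items above, the sketch below fits a simple classifier on synthetic data; the dataset and features are invented for the example and are unrelated to any client data:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Synthetic stand-in data: 500 rows, 4 numeric features, binary outcome.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(500, 4))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    model = LogisticRegression().fit(X_train, y_train)
    print("Hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))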
Interaction: Frequently interacts with subordinate supervisors.
Education: Bachelor's degree, preferably in Computer Science, B.E., or another quantitative field related to the area of assignment. Professional certification related to the area of assignment may be required.
Experience: 7 years of pharmaceutical/biotech/life sciences experience; 5 years of clinical trials experience and knowledge; excellent documentation, communication, and presentation skills, including PowerPoint.
- Sr. Data Engineer:
Core Skills – Data Engineering, Big Data, Pyspark, Spark SQL and Python
Candidate with prior Palantir Cloud Foundry OR Clinical Trial Data Model background is preferred
Major accountabilities:
- Responsible for data engineering, Foundry data pipeline creation, Foundry analysis and reporting, Slate application development, reusable code development and management, and integrating internal or external systems with Foundry for high-quality data ingestion.
- Have a good understanding of the Foundry platform landscape and its capabilities.
- Performs data analysis required to troubleshoot data related issues and assist in the resolution of data issues.
- Defines company data assets (data models) and the PySpark/Spark SQL jobs that populate them.
- Designs data integrations and data quality framework.
- Design and implement integration with internal and external systems and the F1 AWS platform using Foundry Data Connector or the Magritte agent.
- Collaborate with data scientists, data analysts, and technology teams to document and leverage their understanding of the Foundry integration with different data sources.
- Actively participate in agile work practices.
- Coordinate with the quality engineer to ensure that all quality controls, naming conventions, and best practices have been followed.
Desired Candidate Profile :
- Strong data engineering background
- Experience with Clinical Data Model is preferred
- Experience in:
  - SQL Server, Postgres, Cassandra, Hadoop, and Spark for distributed data storage and parallel computing
  - Java and Groovy for our back-end applications and data integration tools
  - Python for data processing and analysis
  - Cloud infrastructure based on AWS EC2 and S3
- 7+ years of IT experience, with 2+ years of experience on the Palantir Foundry platform and 4+ years of experience with big data platforms.
- 5+ years of Python and PySpark development experience (see the short PySpark sketch at the end of this section).
- Strong troubleshooting and problem-solving skills.
- BTech or master's degree in computer science or a related technical field
- Experience designing, building, and maintaining big data pipeline systems.
- Hands-on experience on Palantir Foundry Platform and Foundry custom Apps development
- Able to design and implement data integration between Palantir Foundry and external Apps based on Foundry data connector framework
- Hands-on with programming languages, primarily Python, R, Java, and Unix shell scripts.
- Hands-on experience with the AWS/Azure cloud platforms and stacks.
- Strong in API-based architecture and concepts; able to build quick PoCs using API integration and development.
- Knowledge of machine learning and AI
- Skill and comfort working in a rapidly changing environment with dynamic objectives and iteration with users.
Demonstrated ability to continuously learn, work independently, and make decisions with minimal supervision
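As a rough sketch of the PySpark/Spark SQL work described above (the file paths, columns, and dataset are invented for illustration; an actual Foundry pipeline would use Foundry's dataset reads and writes rather than local files), a minimal batch transform might look like:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example_pipeline").getOrCreate()

    # Ingest a raw extract; the path and schema are placeholders.
    raw = spark.read.option("header", True).csv("/data/raw/subjects.csv")

    # Basic cleansing and typing, then a Spark SQL aggregation.
    clean = (
        raw.dropDuplicates(["subject_id"])
           .withColumn("enrolled_date", F.to_date("enrolled_date", "yyyy-MM-dd"))
           .filter(F.col("status").isNotNull())
    )
    clean.createOrReplaceTempView("subjects")
    summary = spark.sql(
        "SELECT site_id, COUNT(*) AS subject_count FROM subjects GROUP BY site_id"
    )

    # Persist the curated output.
    summary.write.mode("overwrite").parquet("/data/curated/subject_counts")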