

Technogen India Pvt Ltd
https://technogeninc.com
Jobs at Technogen India Pvt Ltd
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
A brief write-up on how you could be a great fit for the role:
What your days and weeks will include.
- Agile scrum meetings, as appropriate, to track and drive individual and team accountability.
- Ad hoc collaboration sessions to define and adopt standards and best practices.
- A weekly ½ hour team huddle to discuss initiatives across the entire team.
- A weekly ½ hour team training session focused on understanding our team delivery model.
- Conduct technical troubleshooting, maintenance, and operational support for production code
- Code! Test! Code! Test!
The skills that make you a great fit.
- 5 to 10 years of experience with Eagle Investment Accounting (a BNY Mellon product) as a domain expert/QA tester.
- Expert-level knowledge of capital markets or investments within financial services.
- Must be self-aware and resilient, possess strong communication skills, and be able to lead.
- Strong experience with release testing in an agile model.
- Strong experience preparing test plans, test cases, and test summary reports.
- Strong experience executing smoke and regression test cases.
- Strong experience utilizing and managing large and complex data sets.
- Experience with SQL Server and Oracle desired.
- Knowledge of the DevOps delivery model, including roles and technologies, desired.
- Analyze client requirements and design customized Saviynt IGA solutions to meet their identity management needs
- Configure and implement Saviynt IGA platform features, including role-based access control, user provisioning, de-provisioning, certification campaigns, and entitlement management
- Collaborate with clients and internal stakeholders to understand business processes and translate them into effective IGA policies
- Assist in the integration of Saviynt IGA with existing identity and access management (IAM) systems, directories, and applications
- Conduct testing and troubleshooting to ensure the accuracy and effectiveness of the implemented IGA solution
Daily and monthly responsibilities
- Review and coordinate with business application teams on data delivery requirements.
- Develop estimation and proposed delivery schedules in coordination with development team.
- Develop sourcing and data delivery designs.
- Review data model, metadata and delivery criteria for solution.
- Review and coordinate with team on test criteria and performance of testing.
- Contribute to the design, development and completion of project deliverables.
- Complete in-depth data analysis and contribute to strategic efforts.
- Develop a complete understanding of how we manage data, with a focus on improving how data is sourced and managed across multiple business areas.
Basic Qualifications
- Bachelor’s degree.
- 5+ years of data analysis working with business data initiatives.
- Knowledge of Structured Query Language (SQL) and use in data access and analysis.
- Proficient in data management including data analytical capability.
- Excellent verbal and written communication skills and high attention to detail.
- Experience with Python.
- Presentation skills in demonstrating system design and data analysis solutions.
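As a minimal sketch of the kind of SQL-based data access and analysis the qualifications above call for, the following uses Python's built-in sqlite3 module; the table and column names are hypothetical, not taken from the job description:

```python
import sqlite3

# Hypothetical example data: an in-memory orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 100.0), ("east", 50.0), ("west", 75.0)],
)

# A typical analysis query: aggregate order amounts per region.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM orders "
    "GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('east', 150.0), ('west', 75.0)]
```

In practice the same GROUP BY/aggregation pattern applies against SQL Server or any other DBMS named in the posting.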
.NET Core is preferred along with SQL, or a very strong .NET developer with SQL.
The role is a mix of activities: documentation, SQL support, development with .NET, integration, and interaction with end users.
We are looking for a Full Stack Developer (remote position).
- Strong knowledge of HTML, CSS, JavaScript, and JavaScript frameworks (e.g., React, Angular, Vue)
- Experience with backend technologies such as Node.js, DynamoDB, and PostgreSQL
- Experience with version control systems (e.g., Git)
- Knowledge of web standards and accessibility guidelines
- Familiarity with server-side rendering and SEO best practices
- Experience with AWS services, including AWS Lambda, API Gateway, and DynamoDB
- Experience with DevOps: Infrastructure as Code, CI/CD, and test & deployment automation
- Experience writing and maintaining a test suite throughout a project's lifecycle
- Familiarity with web accessibility standards and technology
- Experience architecting and building GraphQL APIs and RESTful services
- Data Steward :
The Data Steward will collaborate and work closely within the group's software engineering and business division. The Data Steward has overall accountability for the group's/division's overall data and reporting posture, responsibly managing data assets, data lineage, and data access to support sound data analysis. This role focuses on data strategy, execution, and support for projects, programs, application enhancements, and production data fixes. The Data Steward makes well-thought-out decisions on complex or ambiguous data issues and establishes the data stewardship and information management strategy and direction for the group, communicating effectively with individuals at various levels of the technical and business communities. This individual will join the corporate Data Quality and Data Management/entity resolution team supporting various systems across the board.
Primary Responsibilities:
- Responsible for data quality and data accuracy across all group/division delivery initiatives.
- Responsible for data analysis, data profiling, data modeling, and data mapping capabilities.
- Responsible for reviewing and governing data queries and DML.
- Accountable for the assessment, delivery, quality, accuracy, and tracking of any production data fixes.
- Accountable for the performance, quality, and alignment to requirements for all data query design and development.
- Responsible for defining standards and best practices for data analysis, modeling, and queries.
- Responsible for understanding end-to-end data flows and identifying data dependencies in support of delivery, release, and change management.
- Responsible for the development and maintenance of an enterprise data dictionary aligned to data assets and the business glossary.
- Responsible for the definition and maintenance of the group's data landscape, including overlays with the technology landscape, end-to-end data flows/transformations, and data lineage.
- Responsible for rationalizing the group's reporting posture through the definition and maintenance of a reporting strategy and roadmap.
- Partners with the data governance team to ensure data solutions adhere to the organization’s data principles and guidelines.
- Owns group's data assets including reports, data warehouse, etc.
- Understand customer business use cases and be able to translate them to technical specifications and vision on how to implement a solution.
- Accountable for defining the performance tuning needs for all group data assets and managing the implementation of those requirements within the context of group initiatives as well as steady-state production.
- Partners with others in test data management and masking strategies and the creation of a reusable test data repository.
- Responsible for solving data-related issues and communicating resolutions with other solution domains.
- Actively and consistently support all efforts to simplify and enhance the Clinical Trial Prediction use cases.
- Apply knowledge in analytic and statistical algorithms to help customers explore methods to improve their business.
- Contribute toward analytical research projects through all stages including concept formulation, determination of appropriate statistical methodology, data manipulation, research evaluation, and final research report.
- Visualize and report data findings creatively in a variety of visual formats that appropriately provide insight to the stakeholders.
- Achieve defined project goals within customer deadlines; proactively communicate status and escalate issues as needed.
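The data-profiling and data-quality responsibilities above can be sketched with a minimal example; the records and column names here are hypothetical, and real stewardship work would run checks like this against governed data assets:

```python
# Hypothetical clinical-style records with some missing values.
records = [
    {"id": 1, "site": "A", "enrolled": 25},
    {"id": 2, "site": None, "enrolled": 40},
    {"id": 3, "site": "B", "enrolled": None},
]

def profile(rows, column):
    """Return basic quality metrics for one column: row count,
    null count, and distinct non-null values."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
    }

print(profile(records, "site"))  # {'rows': 3, 'nulls': 1, 'distinct': 2}
```

Null rates and distinct counts like these are the typical starting point for the data-accuracy and data-mapping work the role describes.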
Additional Responsibilities:
- Strong understanding of the Software Development Life Cycle (SDLC) with Agile Methodologies
- Knowledge and understanding of industry-standard/best practices requirements gathering methodologies.
- Knowledge and understanding of Information Technology systems and software development.
- Experience with data modeling and test data management tools.
- Experience in data integration projects.
- Good problem-solving & decision-making skills.
- Good communication skills within the team, site, and with the customer
Knowledge, Skills and Abilities
- Technical expertise in data architecture principles and design aspects of various DBMS and reporting concepts.
- Solid understanding of key DBMS platforms like SQL Server, Azure SQL
- Results-oriented, diligent, and works with a sense of urgency. Assertive, responsible for his/her own work (self-directed), have a strong affinity for defining work in deliverables, and be willing to commit to deadlines.
- Experience in MDM tools like MS DQ, SAS DM Studio, Tamr, Profisee, Reltio etc.
- Experience in Report and Dashboard development
- Statistical and Machine Learning models
- Python (sklearn, numpy, pandas, gensim)
- Nice to Have:
- 1 year of ETL experience
- Natural Language Processing
- Neural networks and Deep Learning
- Experience with the keras, tensorflow, spacy, nltk, and LightGBM Python libraries
Interaction : Frequently interacts with subordinate supervisors.
Education : Bachelor’s degree, preferably in Computer Science, B.E or other quantitative field related to the area of assignment. Professional certification related to the area of assignment may be required
Experience : 7 years of Pharmaceutical/Biotech/life sciences experience; 5 years of Clinical Trials experience and knowledge; excellent documentation, communication, and presentation skills, including PowerPoint
- Sr. Data Engineer:
Core Skills – Data Engineering, Big Data, Pyspark, Spark SQL and Python
Candidate with prior Palantir Cloud Foundry OR Clinical Trial Data Model background is preferred
Major accountabilities:
- Responsible for data engineering, Foundry data pipeline creation, Foundry analysis & reporting, Slate application development, reusable code development & management, and integrating internal or external systems with Foundry for high-quality data ingestion.
- Has a good understanding of the Foundry platform landscape and its capabilities.
- Performs the data analysis required to troubleshoot data-related issues and assists in their resolution.
- Defines company data assets (data models) and the PySpark and Spark SQL jobs that populate them.
- Designs data integrations and the data quality framework.
- Designs & implements integration with internal and external systems and the F1 AWS platform using Foundry Data Connector or Magritte agent.
- Collaborates with data scientists, data analysts, and technology teams to document and leverage their understanding of Foundry's integration with different data sources.
- Actively participates in agile work practices.
- Coordinates with the Quality Engineer to ensure all quality controls, naming conventions & best practices have been followed.
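The last accountability, enforcing quality controls and naming conventions before a dataset is published, can be sketched in plain Python; the snake_case convention and column names below are illustrative assumptions, not Foundry-specific rules:

```python
import re

# Hypothetical pre-publication check: every column must be snake_case
# and a set of required columns must be present.
SNAKE_CASE = re.compile(r"^[a-z][a-z0-9_]*$")

def check_dataset(columns, required):
    """Return a list of human-readable violations (empty list = pass)."""
    issues = []
    for col in columns:
        if not SNAKE_CASE.match(col):
            issues.append(f"column '{col}' is not snake_case")
    for col in required:
        if col not in columns:
            issues.append(f"required column '{col}' is missing")
    return issues

issues = check_dataset(["trial_id", "SiteName"], required=["trial_id", "visit_date"])
print(issues)
```

In a real pipeline a check like this would gate the build step, failing the pipeline rather than printing violations.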
Desired Candidate Profile :
- Strong data engineering background
- Experience with Clinical Data Model is preferred
- Experience in:
  - SQL Server, Postgres, Cassandra, Hadoop, and Spark for distributed data storage and parallel computing
  - Java and Groovy for our back-end applications and data integration tools
  - Python for data processing and analysis
  - Cloud infrastructure based on AWS EC2 and S3
- 7+ years IT experience, 2+ years’ experience in Palantir Foundry Platform, 4+ years’ experience in Big Data platform
- 5+ years of Python and Pyspark development experience
- Strong troubleshooting and problem solving skills
- BTech or master's degree in computer science or a related technical field
- Experience designing, building, and maintaining big data pipeline systems
- Hands-on experience on Palantir Foundry Platform and Foundry custom Apps development
- Able to design and implement data integration between Palantir Foundry and external Apps based on Foundry data connector framework
- Hands-on in programming languages primarily Python, R, Java, Unix shell scripts
- Hands-on experience with the AWS / Azure cloud platforms and stacks
- Strong in API based architecture and concept, able to do quick PoC using API integration and development
- Knowledge of machine learning and AI
- Skill and comfort working in a rapidly changing environment with dynamic objectives and iteration with users.
- Demonstrated ability to continuously learn, work independently, and make decisions with minimal supervision
Similar companies
About the company
Optimo Capital is a newly established NBFC with a mission to serve the underserved MSME businesses with their credit needs in India. Less than 15% of MSMEs have access to formal credit. We aim to bridge this credit gap by employing a phygital model (physical branches + digital decision-making).
Being a technology and data-first company, tech and data enthusiasts play a crucial role in building the tech & infra to help the company thrive.
About the company
At TheBlueOwls, we are passionate about harnessing the power of data analytics and artificial intelligence to transform businesses and drive innovation. With a team of experts and cutting-edge technology, we help our clients unlock valuable insights from their data and leverage AI solutions to stay ahead in today's competitive landscape.
Our Founder
Our company was founded by Puran Ticku, an ex-Microsoft Architect with over 20 years of experience in the field of data and digital health. Puran Ticku has a deep understanding of the potential of data analytics and AI and has successfully led transformative solutions for numerous organizations.
Our Expertise
We specialize in providing comprehensive data analytics services, helping businesses make data-driven decisions and uncover hidden patterns and trends. With our advanced AI capabilities, we enable our clients to automate processes, enhance productivity, and gain a competitive edge in their industries.
Our Approach
At TheBlueOwls, we believe that the key to successful data analytics and AI implementation lies in a holistic approach. We work closely with our clients to understand their unique challenges and goals, and tailor our solutions to meet their specific needs. Our team of skilled professionals utilizes state-of-the-art technology and industry best practices to deliver exceptional results.
Our Commitment
We are committed to delivering the highest level of quality and value to our clients. We strive for excellence in every project we undertake, ensuring that our solutions are not only effective, but also scalable and sustainable. With our deep domain expertise and customer-centric approach, we are dedicated to driving success for our clients and helping them achieve their business objectives.
Contact us today to learn more about how TheBlueOwls can empower your organization with data analytics and AI solutions that drive growth and innovation.
About the company
RockED is the premier people development platform for the automotive industry, supporting the entire employee lifecycle from pre-hire and onboarding to upskilling and career transitions. With microlearning content, gamified delivery, and real-time feedback, RockED is educating the automotive workforce and solving the industry's greatest business challenges.
The RockED Company Inc. is headquartered in Florida. Backed by top industry experts and investors, we're a well-funded startup on an exciting growth journey.
Our India R&D team, based in Bangalore (office on Church Street), leads all product and technology innovation, and we're now expanding this team.
AI will play a key role in improving intelligence, personalization, and overall user experience across the platform.
About the company
Apply for internship programs in 2026 at LetsIntern. Work on real industry projects, gain certificates, stipends & job-ready skills.
About the company
LetsIntern is a trusted student internship platform in India dedicated to bridging the gap between academic learning and real-world industry experience. We provide training cum internship programs designed for students who want practical skills, hands-on project exposure, and career-ready confidence.
About the company
I2B is a venture studio building AI-first consumer and B2B technology companies. We partner with founders to build scalable enterprise platforms.








