

7+ years of experience in Python Development
Good experience in microservices and API development.
Must have exposure to large-scale data.
Good to have: Gen AI (generative AI) experience.
Experience with code versioning and collaboration using Git.
Knowledge of libraries for extracting data from websites (web scraping).
Knowledge of SQL and NoSQL databases
Familiarity with RESTful APIs (see the sketch after this list)
Familiarity with Cloud (Azure /AWS) technologies
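For illustration only, here is a minimal sketch of the kind of work the list above describes: a small REST endpoint built with FastAPI that fetches a page with requests and extracts its title with BeautifulSoup. The framework choice, endpoint name, and URL handling are assumptions, not part of the role description.

    # Hypothetical sketch: FastAPI REST endpoint that extracts a page title.
    # Assumes fastapi, requests, and beautifulsoup4 are installed.
    import requests
    from bs4 import BeautifulSoup
    from fastapi import FastAPI, HTTPException

    app = FastAPI()

    @app.get("/page-title")
    def page_title(url: str):
        """Fetch a page and return its <title> text as JSON."""
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
        except requests.RequestException as exc:
            raise HTTPException(status_code=502, detail=str(exc))
        soup = BeautifulSoup(resp.text, "html.parser")
        return {"url": url, "title": soup.title.string if soup.title else None}

Run with uvicorn (for example, uvicorn main:app) and call GET /page-title?url=... to try it.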
About Wissen Technology:
• The Wissen Group was founded in 2000. Wissen Technology, a part of Wissen Group, was established in 2015.
• Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world-class products.
• Our workforce consists of 550+ highly skilled professionals, with leadership and senior management executives who have graduated from premier institutions such as Wharton, MIT, IITs, IIMs, and NITs, and who bring rich work experience from some of the biggest companies in the world.
• Wissen Technology has grown its revenues by 400% in these five years without any external funding or investments.
• Globally present with offices in the US, India, UK, Australia, Mexico, and Canada.
• We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
• Wissen Technology has been certified as a Great Place to Work®.
• Wissen Technology has been voted as the Top 20 AI/ML vendor by CIO Insider in 2020.
• Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies.
• We have served clients across sectors like Banking, Telecom, Healthcare, Manufacturing, and Energy, including the likes of Morgan Stanley, MSCI, State Street, Flipkart, Swiggy, Trafigura, and GE.
Website: www.wissen.com
Experience Required: 1-3 Years
No. of vacancies: 4
Job Type: Full Time
Vacancy Role: WFO (Work From Office)
Job Category: Development
Job Description
We are seeking a Data Scientist with strong expertise in data analysis, machine learning, and visualization. The ideal candidate should be proficient in Python, Pandas, and Matplotlib, with experience in building and optimizing data-driven models. Some experience in Natural Language Processing (NLP) and Named Entity Recognition (NER) models would be a plus.
Roles & Responsibilities
- Analyze and process large datasets using Python and Pandas.
- Develop and optimize machine learning models for predictive analytics.
- Create data visualizations using Matplotlib and Seaborn to support decision-making (see the sketch after this list).
- Perform data cleaning, feature engineering, and statistical analysis.
- Work with structured and unstructured data to extract meaningful insights.
- Implement and fine-tune NER models for specific use cases (if required).
- Collaborate with cross-functional teams to drive data-driven solutions.
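As a rough illustration of the Pandas/Matplotlib workflow referenced in the list above, the sketch below loads a CSV, does basic cleaning, aggregates by month, and saves a chart. The file name and column names (date, amount) are hypothetical.

    # Hypothetical sketch of the Pandas + Matplotlib workflow; file and columns are made up.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("transactions.csv", parse_dates=["date"])
    df = df.dropna(subset=["amount"])                          # basic cleaning
    monthly = df.set_index("date")["amount"].resample("M").sum()

    fig, ax = plt.subplots(figsize=(8, 4))
    monthly.plot(ax=ax, marker="o")
    ax.set_title("Monthly transaction volume")
    ax.set_ylabel("Amount")
    fig.tight_layout()
    fig.savefig("monthly_volume.png")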
Qualifications
- 1+ years of professional experience
- Strong proficiency in Python and data science libraries (Pandas, NumPy, Scikit-learn, etc.).
- Experience in data analysis, statistical modeling, and machine learning.
- Hands-on expertise in data visualization using Matplotlib and Seaborn.
- Understanding of SQL and database querying.
- Familiarity with NLP techniques and NER models is a plus (a brief sketch follows this list).
- Strong problem-solving and analytical skills.
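For the NER point above, a minimal sketch using spaCy's pretrained English pipeline is shown below; spaCy is an assumption here (the posting does not name a specific library), and en_core_web_sm must be installed separately.

    # Hypothetical NER sketch with spaCy; the model and sample sentence are illustrative.
    import spacy

    nlp = spacy.load("en_core_web_sm")
    doc = nlp("Wissen Technology opened an office in Bangalore in 2015.")
    for ent in doc.ents:
        print(ent.text, ent.label_)   # e.g. ORG, GPE, DATE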
We're looking for a talented and client-focused Salesforce Consultant to lead the design, implementation, and optimization of Salesforce solutions tailored to our clients’ unique needs.
Role Overview:
As a Salesforce Consultant, you will work directly with clients to understand their business requirements, configure and customize Salesforce solutions, and ensure successful deployment. Your deep Salesforce expertise and strategic mindset will be instrumental in delivering high-impact CRM initiatives.
Key Responsibilities:
- Lead end-to-end Salesforce implementations and enhancements for clients across industries.
- Conduct discovery workshops to gather and document business requirements.
- Design scalable Salesforce solutions, ensuring alignment with client goals and best practices.
- Configure Salesforce objects, fields, workflows, reports, dashboards, and automation rules.
- Collaborate with internal stakeholders, developers, and third-party vendors to deliver high-quality solutions.
- Provide user training, documentation, and post-deployment support.
- Stay current with Salesforce releases and recommend system improvements.
Requirements
- 3+ years of experience working as a Salesforce Consultant, Admin, or Business Analyst.
- Strong understanding of Salesforce Sales Cloud, Service Cloud, and CRM workflows.
- Hands-on experience with configuration (custom objects, flows, validation rules, etc.).
- Proven ability to translate business needs into technical requirements and Salesforce configurations.
- Excellent communication, stakeholder management, and problem-solving skills.
The Knowledge Graph Architect is responsible for designing, developing, and implementing knowledge graph technologies to enhance organizational data understanding and decision-making capabilities. This role involves collaborating with data scientists, engineers, and business stakeholders to integrate complex data into accessible and insightful knowledge graphs.
Work you’ll do
1. Design and develop scalable and efficient knowledge graph architectures.
2. Implement knowledge graph integration with existing data systems and business processes.
3. Lead the ontology design, data modeling, and schema development for knowledge representation.
4. Collaborate with IT and business units to understand data needs and deliver comprehensive knowledge graph solutions.
5. Manage the lifecycle of knowledge graph data, including quality, consistency, and updates.
6. Provide expertise in semantic technologies and machine learning to enhance data interconnectivity and retrieval.
7. Develop and maintain documentation and specifications for system architectures and designs.
8. Stay updated with the latest industry trends in knowledge graph technologies and data management.
The Team
Innovation & Technology anticipates how technology will shape the future and begins building future capabilities and practices today. I&T drives the ideation, incubation, and scaling of hybrid businesses and tech-enabled offerings across the prioritized offering portfolio and industry interactions.
It drives cultural and capability transformation from solely services-based businesses to hybrid businesses. While others bet on the future, I&T builds it with you.
I&T encompasses many teams—dreamers, designers, builders—and partners with the business to bring a unique POV to deliver services and products for clients.
Qualifications and Experience
Required:
1. Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
2. 6-10 years of professional experience in data engineering, with proven experience in designing and implementing knowledge graph systems.
3. Strong understanding of semantic web technologies (RDF, SPARQL, GraphQL, OWL, etc.); a brief sketch follows this list.
4. Experience with graph databases such as Neo4j, Amazon Neptune, or others.
5. Proficiency in programming languages relevant to data management (e.g., Python, Java, JavaScript).
6. Excellent analytical and problem-solving abilities.
7. Strong communication and collaboration skills to work effectively across teams.
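As a small illustration of the semantic web stack named in the Required list, the sketch below builds a tiny RDF graph with rdflib and runs a SPARQL query over it. rdflib is one of several possible toolkits, and the namespace, triples, and query are illustrative only.

    # Hypothetical knowledge-graph sketch: build a small RDF graph and query it with SPARQL.
    from rdflib import Graph, Literal, Namespace, RDF

    EX = Namespace("http://example.org/")
    g = Graph()
    g.add((EX.alice, RDF.type, EX.Engineer))
    g.add((EX.alice, EX.worksOn, EX.enterpriseGraph))
    g.add((EX.enterpriseGraph, EX.label, Literal("Enterprise Knowledge Graph")))

    results = g.query("""
        PREFIX ex: <http://example.org/>
        SELECT ?person WHERE { ?person ex:worksOn ex:enterpriseGraph . }
    """)
    for row in results:
        print(row.person)   # -> http://example.org/alice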
Preferred:
1. Experience with machine learning and natural language processing.
2. Experience with Industry 4.0 technologies and principles
3. Prior exposure to cloud platforms and services like AWS, Azure, or Google Cloud.
4. Experience with containerization technologies like Docker and Kubernetes
- 15+ years of experience in OFSAA Financial Solution Data Foundation and OFSAA Regulatory Reporting solutions.
- Expert in enterprise solution architecture and design.
- Strong understanding of the OFSAA data model, dimension management, and enterprise data warehouse concepts.
- Strong understanding of reconciling OFSAA instrument balances with General Ledger summary-level balances.
- Experience in defining and building the OFSAA data architecture and sourcing strategy to ensure data accuracy, integrity, and quality.
- Understanding of banking treasury products, US Fed regulatory requirements, etc.
- Strong understanding of data lineage building.
- Strong knowledge of OFSAA data management tools (F2T/T2T/PLT/SCDs).
- Experience in business rules configuration in the OFSAA framework.
- Strong experience in deploying the OFSAA platform (OFSAAI: OFSAA Infrastructure) and installing OFSAA applications, preferably OFSAA 8.x onwards.
- Conduct competitive analyses of other websites that operate in the same space
- Prepare customer analyses based on our target demographic and initial transactions
- Participate in creating a content development strategy
- Coordinate with UI design team on issues like navigation, page routing, product page design and more
- Track usability goals and prepare reports for senior management
- Develop mockups for our development and design team
- Conduct usability tests on each independent type of page on the website and create a report showcasing your findings



What you will be doing:
As a part of the Global Credit Risk and Data Analytics team, this person will be responsible for carrying out analytical initiatives, including the following:
- Dive into the data and identify patterns
- Development of end-to-end Credit models and credit policy for our existing credit products
- Leverage alternate data to develop best-in-class underwriting models
- Working on Big Data to develop risk analytical solutions
- Development of Fraud models and fraud rule engine
- Collaborate with various stakeholders (e.g. tech, product) to understand and design best solutions which can be implemented
- Working on cutting-edge techniques, e.g., machine learning and deep learning models
Example of projects done in past:
- Lazypay Credit Risk model using the CatBoost modelling technique; end-to-end pipeline for feature engineering and model deployment in production using Python (a minimal sketch follows this list)
- Fraud model development, deployment and rules for EMEA region
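A minimal sketch of a CatBoost credit-risk classifier in the spirit of the first project is shown below; the dataset, feature names, and parameters are hypothetical and do not reflect the actual Lazypay pipeline.

    # Hypothetical CatBoost credit-risk sketch; data, features, and parameters are made up.
    import pandas as pd
    from catboost import CatBoostClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("credit_applications.csv")            # hypothetical dataset
    features = ["income", "utilisation", "channel"]        # hypothetical features
    cat_features = ["channel"]                             # categorical columns

    X_train, X_test, y_train, y_test = train_test_split(
        df[features], df["defaulted"], test_size=0.2, random_state=42
    )

    model = CatBoostClassifier(iterations=300, depth=6, verbose=False)
    model.fit(X_train, y_train, cat_features=cat_features)

    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"Hold-out AUC: {auc:.3f}")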
Basic Requirements:
- 1-3 years of work experience as a Data Scientist (in the credit domain)
- 2016 or 2017 batch from a premier college (e.g., B.Tech. from IITs/NITs, Economics from DSE/ISI, etc.)
- Strong problem-solving skills and the ability to understand and execute complex analyses
- Experience in at least one of R/Python/SAS, plus SQL
- Experience in the credit industry (fintech/bank)
- Familiarity with the best practices of Data Science
Add-on Skills:
- Experience in working with big data
- Solid coding practices
- Passion for building new tools/algorithms
- Experience in developing Machine Learning models

