
Key Responsibilities:
- Perform comprehensive Functional and Integration Testing across Oracle modules and connected systems.
- Conduct detailed End-to-End (E2E) Testing to ensure business processes function seamlessly across applications.
- Collaborate with cross-functional teams, including Business Analysts, Developers, and Automation teams, to validate business requirements and deliver high-quality releases.
- Identify, document, and track functional defects, ensuring timely closure and root cause analysis.
- Execute and validate SQL queries for backend data verification and cross-system data consistency checks (see the SQL sketch after this list).
- Participate in regression cycles and support continuous improvement initiatives through data-driven analysis.
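As a rough illustration of the SQL-based backend validation referenced above, here is a minimal sketch of a cross-system count check; the table names (ap_invoices_all, invoices_stg) and the connection objects are assumptions for illustration only, not part of the role description.

```python
# Minimal cross-system consistency check (illustrative only).
# Works with any DB-API 2.0 connections, e.g. python-oracledb for Oracle
# and psycopg2 for a downstream system.

def fetch_count(conn, table: str) -> int:
    """Return the row count of a table on one system."""
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        cur.close()

def check_consistency(oracle_conn, downstream_conn) -> None:
    """Compare record counts between Oracle and the downstream system."""
    src = fetch_count(oracle_conn, "ap_invoices_all")   # hypothetical source table
    tgt = fetch_count(downstream_conn, "invoices_stg")  # hypothetical target table
    assert src == tgt, f"Count mismatch: Oracle={src}, downstream={tgt}"
```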
Required Skills & Competencies:
- Strong knowledge of Functional Testing processes and methodologies.
- Oracle Fusion knowledge is good to have.
- Solid understanding of Integration Flows between Oracle and peripheral systems.
- Proven ability in E2E Testing, including scenario design, execution, and defect management.
- Excellent Analytical and Logical Reasoning skills with attention to detail.
- Hands-on experience with SQL for data validation and analysis.
- Effective communication, documentation, and coordination skills.
Preferred Qualifications:
- Exposure to automation-assisted functional testing and cross-platform data validation.
- Experience in identifying test optimization opportunities and improving testing efficiency.

About Wissen Technology
The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains.
With offices in the US, India, the UK, Australia, Mexico, and Canada, we offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation.
Leveraging our multi-site operations in the USA and India and the availability of world-class infrastructure, we offer a combination of on-site, off-site, and offshore service models. Our technical competencies, proactive management approach, proven methodologies, committed support, and ability to react quickly to urgent needs make us a valued partner for any kind of Digital Enablement Services, Managed Services, or Business Services.
We believe that the technology and thought leadership we command in the industry are the direct result of the kind of people we have been able to attract to this organization (you are one of them!).
Our workforce consists of 1000+ highly skilled professionals, with leadership and senior management executives who have graduated from premier institutions such as MIT, Wharton, the IITs, the IIMs, and BITS, and who bring rich work experience from some of the biggest companies in the world.
Wissen Technology has been certified as a Great Place to Work®. The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients.
Similar jobs
Key Responsibilities: Strong communication and basic knowledge of documentation.
If you're interested, please share your updated resume along with the following details:
Total Experience
Relevant Experience
Current CTC
Expected CTC
Notice Period
Highest Qualification
Current Organization
Current Location
Looking forward to your response!
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Ingest data from different sources that expose data through a variety of technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series feeds from various proprietary systems; implement data ingestion and processing with the help of Big Data technologies
- Process and transform data using technologies such as Spark and cloud services; understand your part of the business logic and implement it in the language supported by the base data platform
- Develop automated data quality checks to make sure the right data enters the platform and to verify the results of calculations (a PySpark sketch follows this list)
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Develop queries, write reports, and present findings
- Mentor junior team members and bring in industry best practices
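To make the automated data-quality-check responsibility above concrete, here is a minimal PySpark sketch; the S3 path and the column names (loan_id, customer_id) are assumptions for illustration only.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("quality-checks").getOrCreate()

# Hypothetical landing location for a daily batch.
df = spark.read.parquet("s3://example-bucket/raw/loans/")

def run_quality_checks(df, key_cols, min_rows=1):
    """Fail fast if the batch is empty or required key columns contain nulls."""
    total = df.count()
    if total < min_rows:
        raise ValueError(f"Batch too small: {total} rows")
    for col in key_cols:
        nulls = df.filter(F.col(col).isNull()).count()
        if nulls:
            raise ValueError(f"{nulls} null values in required column '{col}'")
    return total

run_quality_checks(df, key_cols=["loan_id", "customer_id"])
```

In practice a check like this would run as a step in the ingestion pipeline, before data is published to downstream consumers.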
QUALIFICATIONS
- 5-7+ years of experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of at least one of the following languages: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with:
  - Data mining/programming tools (e.g. SAS, SQL, R, Python)
  - Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
  - Data visualization tools (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting languages: Python and PySpark
- 3+ years of experience in Technology.
- A strong product design sense.
- Good experience working with the Go (Golang) programming language.
- Understand end-user requirements, formulate use cases and come up with effective solutions.
- Good understanding of REST APIs and the web in general.
Job Title: Senior Full Stack Developer
Employment Type: Full-time
Years of Experience: 4-7 years
Number of Positions: 3
Education: B.Tech, B.E., MSc-IT, or MCA degree, or equivalent
Work Location: Hyderabad
Mode of Work: Work From Office
Notice Period: Immediate (within 15 days)
On-site Feasibility: Yes
Interview Rounds:
- L1 - HRBP (Virtual Screening)
- L2 - Tech Lead (Virtual)
- L3 - Sr Tech Lead/CEO (Virtual)
- L4 - Managerial Round (CTO) (Virtual)
Interview Panel Availability: Preferably Weekends or after 6 pm weekdays
Working Hours: 11:30 am to 8:30 pm (Monday - Friday)
Hiring for: Product/Client
Role Reporting to: Tech Lead
Required Skills:
- 4+ years of development experience in .NET.
- Proficiency in JavaScript or TypeScript.
- Proficiency in C#, .NET, .NET Core, Entity Framework.
- Solid understanding of software development principles (SOLID).
- Expert in Azure/AWS.
- Understanding of Design Patterns.
- Excellent communication skills.
Good to Have:
- Knowledge of SQL databases.
- Certification in Azure/AWS.
- Strong knowledge of Vue.js.
- 2+ years of development experience in Vue.js or Angular (Version 10 and above) framework.
Roles & Responsibilities:
- Develop new applications including Web APIs, Webjobs, Functions, etc.
- Develop front-end applications using Micro Frontend Architecture.
- Write Unit Tests and Integration Tests to ensure software quality.
- Troubleshoot and debug applications.
- Mentor and train other team members.
- Maintain application performance by identifying and resolving production and development issues, installing updates and patches, and conducting maintenance tasks.
- Provide support for applications by developing utilities, addressing inquiries, and resolving issues.
- Stay updated with the latest technologies and trends by participating in educational opportunities and obtaining relevant certifications.
- Contribute to the achievement of organizational goals by completing assigned tasks and projects effectively.
Employee Benefits:
- Health Insurance (covering spouse and 2 children)
- Provident Fund (PF)
We are hiring an Export Logistics Manager for Karnal who will be responsible for handling the end-to-end logistics cycle.
Enterprise Minds, with a core focus on engineering products, automation, and intelligence, partners with customers on the trajectory towards increasing outcomes, relevance, and growth.
Harnessing the power of data and the forces that define AI, Machine Learning, and Data Science, we believe in institutionalizing go-to-market models rather than just exploring possibilities.
We believe in a customer-centric ethic without and a people-centric paradigm within. With a strong sense of community, ownership, and collaboration, our people work in a spirit of co-creation, co-innovation, and co-development to engineer next-generation software products with the help of accelerators.
Through Communities, we connect and attract talent that shares skills and expertise. Through Innovation Labs and global design studios, we deliver creative solutions.
We create isolated vertical pods with a narrow but deep focus. We also create horizontal pods to collaborate and deliver sustainable outcomes.
We follow Agile methodologies to fail fast and deliver scalable and modular solutions. We constantly self-assess and realign to work with each customer in the most impactful manner.
Pre-requisites for the Role
- Job ID-EMBD0120PS
- Primary skills: GCP Data Engineer, BigQuery, ETL
- Secondary skills: Hadoop, Python, Spark
- Years of Experience: 5-8 years
- Location: Remote
Budget: Open
Notice Period: Immediate
GCP DATA ENGINEER
Position description
- Designing and implementing software systems
- Creating systems for collecting data and for processing that data
- Using Extract, Transform, Load (ETL) operations (see the BigQuery sketch after this list)
- Creating data architectures that meet the requirements of the business
- Researching new methods of obtaining valuable data and improving its quality
- Creating structured data solutions using various programming languages and tools
- Mining data from multiple areas to construct efficient business models
- Collaborating with data analysts, data scientists, and other teams.
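For illustration, a minimal sketch of an ETL load-and-transform step on BigQuery using the google-cloud-bigquery client; the project, dataset, bucket, and column names are placeholders, not taken from the role.

```python
from google.cloud import bigquery

# Hypothetical project, dataset, and GCS path, purely for illustration.
client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Load a Parquet extract from Cloud Storage into a raw table,
# then apply a transformation with standard SQL.
load_job = client.load_table_from_uri(
    "gs://example-bucket/staging/orders/*.parquet",
    "example-project.analytics.orders_raw",
    job_config=job_config,
)
load_job.result()  # wait for the load to complete

client.query(
    """
    CREATE OR REPLACE TABLE `example-project.analytics.orders_daily` AS
    SELECT order_date, SUM(amount) AS total_amount
    FROM `example-project.analytics.orders_raw`
    GROUP BY order_date
    """
).result()
```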
Candidate profile
- Bachelor’s or master’s degree in information systems/engineering, computer science, management, or a related field.
- 5-8 years of professional experience as a Big Data Engineer
- Proficiency in modelling and maintaining data lakes with PySpark (preferred).
- Experience with Big Data technologies (e.g., Databricks)
- Ability to model and optimize workflows on GCP.
- Experience with Streaming Analytics services (e.g., Kafka, Grafana); see the streaming sketch after this list.
- Analytical, innovative and solution-oriented mindset
- Teamwork, strong communication and interpersonal skills
- Rigor and organizational skills
- Fluency in English (spoken and written).
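As a small illustration of the streaming-analytics experience mentioned above, here is a Spark Structured Streaming sketch that reads from Kafka; the broker address and topic are placeholders, and the spark-sql-kafka connector is assumed to be available on the classpath.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# Hypothetical broker address and topic name.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers the value as bytes; cast it to a string for downstream parsing.
# The aggregation below only uses the event timestamp: count events per minute.
counts = (
    events.select(F.col("value").cast("string").alias("payload"), F.col("timestamp"))
    .groupBy(F.window("timestamp", "1 minute"))
    .count()
)

query = (
    counts.writeStream.outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```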
It's a full-time position with our client.
Date of Joining: Immediate Joiners (within 7-10 Days)
Work Location: Hyderabad
Experience Level: 6-10 years
Mandatory Skills: WebAPI, Angular 4+, MVC, and SQL
Job description:
•Expertise in Web API is highly preferred.
•Good experience needed in Angular 4+ implementation.
•Must have very good exposure and experience working with C#, ASP.NET, MVC, Entity Framework, Web Services, JavaScript, jQuery, and SQL Server.
•Strong Knowledge of software implementation best practices.
•Strong experience in debugging and working with n-tier architecture (UI, business layer, and data access layer), along with some experience with service-oriented architectures (SOA)
•Ability to design and optimize SQL Server stored procedures.
•Solid understanding of object-oriented programming (OOP).
•Experience using version control (Git/Subversion)
•Experience using Jira and Confluence
•Develop and enhance new and existing software applications
•Mentor and train other team members
•Gain knowledge of the Energy Industry
•Provide documentation and training on the solution
Who are we?
We are a venture capital-backed software development company headquartered in Canada. We develop in-house products to disrupt one industry at a time and partner as a technology service provider to selected startups.
Who are you?
Experience in writing applications using Node.js, including Express or similar frameworks.
Must be proficient in MySQL or another database such as MongoDB.
Proficient in JavaScript, with good experience and knowledge of open-source tools, frameworks, and broader cutting-edge technologies around server-side development.
Excellent data structure, algorithm, and problem-solving skills.
Created and consumed various APIs in the past.
Should be an active contributor to developer communities like Stack Overflow, GitHub, Google Developer Groups (GDGs).
Customer-focused, react well to changes, work with teams, and able to multi-task.
Must be a proven performer and team player that enjoys challenging assignments in a high-energy, fast-growing, and start-up workplace.
Must be a self-starter who can work well with minimal guidance and in a fluid environment.
Some of the technologies we use are:
NodeJS
ExpressJS
Angular 9
AWS
GitHub
PSQL, MongoDB
Selenium
Responsibilities:
*Approaching corporate and retail clients for events
*Understanding the complete requirements
*Sending proposal & quotations
*Getting the deal signed
*Dealing with guests over the telephone
*Putting conference dates in the diary
*Negotiating rates
*Ensuring marketing promotions run at the right time