Responsibilities:
- Participate in and lead business requirements gathering, BRD documentation, and overall solution design.
- Implement the designed solution in NetSuite and support the project team.
- Ensure implementation follows industry-standard best practices.
- Participate in reviews, check solution feasibility and functional fit within NetSuite, and align customer expectations with NetSuite capabilities to ensure smooth project delivery.
- Prepare test cases for implemented modules and validate deliverables before client testing and UAT.
- Follow up with the client and support teams for issue resolution.
- Document issue resolutions and share them with the internal team on an ongoing basis.
- Interact with key end users and business owners to map applications to standard business processes and conduct gap analysis.
- Suggest process improvements based on application capabilities and industry best practices.
- Support all formal implementation documentation and provide relevant functional inputs to the technical team.
- Contribute to the training and development of the key NetSuite team.
- Support pre-sales activities, demos, and estimates as and when required.
- Ensure weekly client reporting and support the project management team.
Qualifications/Skills:
- Bachelor’s degree, preferably in Engineering, MCA, M.Sc., or Accounting.
- Minimum 5 years’ experience in NetSuite, with 8 to 10 years of overall end-to-end implementation experience.
- Good knowledge of all NetSuite modules with a strong understanding of functional processes.
- NetSuite Foundation / ERP Consultant certification preferred.
- Demonstrated experience in translating customer business requirements into workable business solutions, preferably on SaaS.

Job Title : ServiceNow Integration Developer
Experience : 6+ Years
Location : Remote/On-site (as applicable)
Summary :
- We are looking for an experienced ServiceNow Integration Developer to design, develop, and implement workflows, applications, and integrations on the ServiceNow platform.
- The ideal candidate will have hands-on expertise in Service Portal development and system integrations, combined with excellent problem-solving and communication skills.
Key Responsibilities :
- Design and develop workflows, applications, and integrations on the ServiceNow platform.
- Build and customize Service Portal to meet user requirements.
- Collaborate with stakeholders to gather requirements and translate business needs into technical solutions.
- Develop, test, and deploy scripts, workflows, and integrations.
- Create prototypes and mockups to demonstrate design solutions.
- Troubleshoot and resolve issues, ensuring seamless operation of ServiceNow solutions.
- Stay updated with the latest ServiceNow features, tools, and best practices.
Required Skills and Experience :
- ServiceNow Expertise: Strong experience in design, development, and customization of the ServiceNow platform.
- Integration Proficiency: Hands-on experience with platform integrations and Scripted REST APIs.
- Workflow Development: Proficient in using tools such as Workflow, Workflow Editor, and automation scripts.
- Service Portal Development: Proven experience in building and customizing Service Portals.
- Architecture Knowledge: Strong understanding of ServiceNow architecture, data models, and best practices.
- Problem-Solving: Excellent troubleshooting skills to resolve platform-related issues.
- Collaboration: Strong communication and interpersonal skills to work effectively with cross-functional teams.
Educational Qualifications :
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
Why Join Us?
- Competitive compensation.
- Opportunity to work on cutting-edge ServiceNow solutions.
- Collaborative and growth-oriented work environment.

Senior Generative AI Engineer
Job Id: QX016
About Us:
The QX impact was launched with a mission to make AI accessible and affordable and to deliver AI products and solutions at scale for enterprises by bringing the power of data, AI, and engineering to drive digital transformation. We believe that without insights, businesses will continue to face challenges in understanding their customers and may even lose them; without insights, businesses won’t be able to deliver differentiated products and services; and without insights, businesses can’t achieve the new level of “Operational Excellence” that is crucial to remain competitive, meet rising customer expectations, expand into new markets, and digitalize.
Job Summary:
We seek a highly experienced Senior Generative AI Engineer who will focus on the development, implementation, and engineering of Gen AI applications using the latest LLMs and frameworks. This role requires hands-on expertise in Python programming, cloud platforms, and advanced AI techniques, along with additional skills in front-end technologies, data modernization, and API integration. The Senior Gen AI Engineer will be responsible for building applications from the ground up, ensuring robust, scalable, and efficient solutions.
Responsibilities:
· Build GenAI solutions such as virtual assistants, data augmentation, automated insights, and predictive analytics.
· Design, develop, and fine-tune generative AI models (GANs, VAEs, Transformers).
· Handle data preprocessing, augmentation, and synthetic data generation.
· Work with NLP, text generation, and contextual comprehension tasks.
· Develop backend services using Python or .NET for LLM-powered applications (a brief illustrative sketch follows this list).
· Build and deploy AI applications on cloud platforms (Azure, AWS, GCP).
· Optimize AI pipelines and ensure scalability.
· Stay updated with advancements in AI and ML.
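As a purely illustrative sketch of the kind of backend service mentioned in the responsibilities above (not part of the role description), the snippet below shows a minimal Python/FastAPI endpoint that forwards a prompt to an LLM. The model name, client setup, and endpoint path are assumptions chosen for the example.

```python
# Minimal illustrative sketch: a FastAPI backend exposing one LLM-powered endpoint.
# Assumptions: OPENAI_API_KEY is set in the environment and the model name is a placeholder.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI  # requires openai>=1.0

app = FastAPI()
client = OpenAI()

class Prompt(BaseModel):
    text: str

@app.post("/generate")
def generate(prompt: Prompt) -> dict:
    """Forward the prompt to a chat model and return the completion text."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[{"role": "user", "content": prompt.text}],
    )
    return {"completion": resp.choices[0].message.content}

# Run locally with: uvicorn main:app --reload  (assuming this file is main.py)
```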
Skills & Requirements:
- Strong knowledge of machine learning, deep learning, and NLP.
- Proficiency in Python, TensorFlow, PyTorch, and Keras.
- Experience with cloud services, containerization (Docker, Kubernetes), and AI model deployment.
- Understanding of LLMs, embeddings, and retrieval-augmented generation (RAG); a minimal illustrative sketch follows this list.
- Ability to work independently and as part of a team.
- Bachelor’s degree in Computer Science, Mathematics, Engineering, or a related field.
- 6+ years of experience in Gen AI, or related roles.
- Experience with AI/ML model integration into data pipelines.
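Because the requirements above call out retrieval-augmented generation, here is a minimal, hedged sketch of the retrieve-then-prompt pattern in Python. The embedding model, toy corpus, and prompt format are assumptions made for illustration only, not taken from the job description.

```python
# Illustrative RAG sketch: embed a tiny corpus, retrieve the closest documents,
# and build a grounded prompt for an LLM. Model name and corpus are placeholders.
# Requires: pip install faiss-cpu sentence-transformers
import numpy as np
import faiss
from sentence_transformers import SentenceTransformer

docs = [
    "Retrieval-augmented generation grounds LLM answers in retrieved documents.",
    "FAISS performs fast similarity search over dense vectors.",
    "Embeddings map text to points in a vector space.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
doc_vectors = embedder.encode(docs, normalize_embeddings=True)

index = faiss.IndexFlatIP(doc_vectors.shape[1])  # inner product == cosine on normalized vectors
index.add(np.asarray(doc_vectors, dtype="float32"))

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the question."""
    q = embedder.encode([question], normalize_embeddings=True)
    _, ids = index.search(np.asarray(q, dtype="float32"), k)
    return [docs[i] for i in ids[0]]

question = "What does RAG do?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
# `prompt` would then be sent to an LLM, e.g. via the endpoint sketched earlier.
```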
Core Competencies for Generative AI Engineers:
1. Programming & Software Development
a. Python – Proficiency in writing efficient and scalable code, with strong knowledge of NumPy, Pandas, TensorFlow, PyTorch, and Scikit-learn.
b. LLM Frameworks – Experience with Hugging Face Transformers, LangChain, OpenAI API, and similar tools for building and deploying large language models.
c. API integration using frameworks such as FastAPI, Flask, or Django, along with RESTful APIs and WebSockets.
d. Knowledge of Version Control, containerization, CI/CD Pipelines and Unit Testing.
2. Vector Database & Cloud AI Solutions
a. Pinecone, FAISS, ChromaDB, Neo4j
b. Azure Redis/ Cognitive Search
c. Azure OpenAI Service
d. Azure ML Studio Models
e. AWS (Relevant Services)
3. Data Engineering & Processing
- Handling large-scale structured & unstructured datasets.
- Proficiency in SQL, NoSQL (PostgreSQL, MongoDB), Spark, and Hadoop.
- Feature engineering and data augmentation techniques.
4. NLP & Computer Vision
- NLP: Tokenization, embeddings (Word2Vec, BERT, T5, LLaMA).
- CV: Image generation using GANs, VAEs, Stable Diffusion.
- Document Embedding – Experience with vector databases (FAISS, ChromaDB, Pinecone) and embedding models (BGE, OpenAI, SentenceTransformers).
- Text Summarization – Knowledge of extractive and abstractive summarization techniques using models like T5, BART, and Pegasus.
- Named Entity Recognition (NER) – Experience in fine-tuning NER models and using pre-trained models from SpaCy, NLTK, or Hugging Face.
- Document Parsing & Classification – Hands-on experience with OCR (Tesseract, Azure Form Recognizer), NLP-based document classifiers, and tools like LayoutLM, PDFMiner.
5. Model Deployment & Optimization
- Model compression (quantization, pruning, distillation); a short quantization sketch follows this competencies list.
- Deployment using Azure CI/CD, ONNX, TensorRT, OpenVINO on AWS, GCP.
- Model monitoring (MLflow, Weights & Biases) and automated workflows (Azure Pipeline).
- API integration with front-end applications.
6. AI Ethics & Responsible AI
- Bias detection, interpretability (SHAP, LIME), and security (adversarial attacks).
7. Mathematics & Statistics
- Linear Algebra, Probability, and Optimization (Gradient Descent, Regularization, etc.).
8. Machine Learning & Deep Learning
a. Expertise in supervised, unsupervised, and reinforcement learning.
b. Proficiency in TensorFlow, PyTorch, and JAX.
c. Experience with Transformers, GANs, VAEs, Diffusion Models, and LLMs (GPT, BERT, T5).
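To illustrate the model-compression point under item 5 above (quantization), the following is a minimal, hedged PyTorch sketch of post-training dynamic quantization. The toy model and layer choice are assumptions made only for the example.

```python
# Illustrative sketch: post-training dynamic quantization of a small PyTorch model.
# The model architecture here is a placeholder, not anything specified by the role.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))
model.eval()

# Quantize Linear weights to int8; activations are quantized dynamically at inference time.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 512)
with torch.no_grad():
    print(quantized(x).shape)  # torch.Size([1, 10]), now served with int8 weights
```

Dynamic quantization typically shrinks the weight footprint of Linear-heavy models by roughly 4x (float32 to int8); pruning and distillation would be separate steps layered on top.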
Personal Attributes:
- Strong problem-solving skills with a passion for data architecture.
- Excellent communication skills with the ability to explain complex data concepts to non-technical stakeholders.
- Highly collaborative, capable of working with cross-functional teams.
- Ability to thrive in a fast-paced, agile environment while managing multiple priorities effectively.
Why Join Us?
- Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions.
- Work on impactful projects that make a difference across industries.
- Opportunities for professional growth and continuous learning.
- Competitive salary and benefits package.
Ready to make an impact? Apply today and become part of the QX impact team!


Dot Net Developer
Location: Remote
Experience required: 2-15 years
Roles and Responsibilities:
In this role, you will be responsible for designing and developing enterprise-grade applications. As a member of the team, you will be expected to take ownership of individual platform components.
Key responsibilities include:
- Developing back-end web applications.
- Creating servers and databases for functionality.
- Testing and debugging various applications.
- Reviewing and refactoring code.
- Designing and developing APIs.
- Applying a strong understanding of relational databases.
Basic Technical Qualifications
- 2+ years of experience in building out enterprise-grade applications.
Must have strong hands-on development experience in the following:
- .NET Framework (C#)
- .NET Core
- Exposing and consuming JSON-based RESTful services
- MongoDB, SQL
- OOP concepts
- Troubleshooting abilities
- Experience with unit testing
- Familiarity with Agile methodologies
- Experience in the end-to-end release of highly reliable applications, including development and testing
- Bachelor’s degree or equivalent in Computer Science/Software Engineering (or related fields)
Soft Skills
- Strong work ethic and dedication
- An aptitude and interest in both technology and business
- Excellent written and verbal communication skills are a must
- Highly motivated and interested in following up on detailed business or technical issues and understanding the functional and technical impact of any change
- Willingness to take initiative and work independently
LogiNext is looking for an experienced and dedicated Presales Solutions Manager to become a part of our fast-growing team. As a tech enthusiast, you will be passionate about conveying value to our clients while closing huge and complex deals. With a deep understanding of Enterprise SaaS applications, you will offer solutions where LogiNext products can be put to best use, helping clients achieve their visionary objectives.
You should be a reliable technical advisor to clients and overcome complicated implementation challenges. You will lead the solution design throughout the sales cycle and deliver ideas and solutions to clients to transform their customers’ experience. You should have an intense desire to set the vision, transforming business goals into exciting and actionable propositions.
Responsibilities :
- Develop and convey out-of-the-box solutions, along with solution ideas, to key decision makers to focus on their business issues
- Understand and articulate the benefits of LogiNext products to educate enterprise clients on the value proposition of our products
- Conduct on-field and on-site proofs of concept wherever required to help clients validate technical requirements
- Work with account managers and business development managers to pilot complex deal cycles with C-level executives
- Work cross-functionally with business development, marketing, technology, and finance teams to ensure the timely and successful delivery of solutions according to customer needs and objectives
- Recognize product and technology disparities with customers and present a point of view to product and leadership teams
- Run change management programs to drive change on the ground by working with the client’s on-field workforce at the warehouse or remote branches
- Perform business analytics on the client’s business KPIs and present findings to the client’s management
- Generate leads by reaching out to prospective clients across countries and time zones
- Assist business development managers in achieving sales targets by conducting pilots, showcasing results, and building conviction at the client
Requirements :
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Business Management, Statistics, or a related field
- 4 to 7 years of experience in technical pre-sales or sales, preferably in SaaS companies
- Solid know-how of Enterprise SaaS products
- Ability to multi-task to a high degree with passion, strong initiative, and a positive attitude
- Advanced skill set for driving system integrations, gathering requirements, documenting RFIs/RFPs, and cross-functional project management
- Proficiency in Excel and SQL
- Excellent written and verbal communication skills and the ability to persuade, influence, negotiate, and make formal presentations in meetings and training environments
- Confident and dynamic working persona that can bring fun to the team; a sense of humor is a plus
- Strong organizational skills, judgment and decision-making skills, and the ability to work under continual deadline pressure
- Willingness to travel around 100% of the time within the city and between cities

Role : Web Scraping Engineer
Experience : 2 to 3 Years
Job Location : Chennai
About OJ Commerce:
OJ Commerce (OJC), a rapidly expanding and profitable online retailer, is headquartered in Florida, USA, with a fully-functional office in Chennai, India. We deliver exceptional value to our customers by harnessing cutting-edge technology, fostering innovation, and establishing strategic brand partnerships to enable a seamless, enjoyable shopping experience featuring high-quality products at unbeatable prices. Our advanced, data-driven system streamlines operations with minimal human intervention.
Our extensive product portfolio encompasses over a million SKUs and more than 2,500 brands across eight primary categories. With a robust presence on major platforms such as Amazon, Walmart, Wayfair, Home Depot, and eBay, we directly serve consumers in the United States.
As we continue to forge new partner relationships, our flagship website, www.ojcommerce.com, has rapidly emerged as a top-performing e-commerce channel, catering to millions of customers annually.
Job Summary:
We are seeking a Web Scraping Engineer and Data Extraction Specialist who will play a crucial role in our data acquisition and management processes. The ideal candidate will be proficient in developing and maintaining efficient web crawlers capable of extracting data from large websites and storing it in a database. Strong expertise in Python, web crawling, and data extraction, along with familiarity with popular crawling tools and modules, is essential. Additionally, the candidate should demonstrate the ability to effectively utilize API tools for testing and retrieving data from various sources. Join our team and contribute to our data-driven success!
Responsibilities:
- Develop and maintain web crawlers in Python (a brief illustrative sketch follows this list).
- Crawl large websites and extract data.
- Store data in a database.
- Analyze and report on data.
- Work with other engineers to develop and improve our web crawling infrastructure.
- Stay up to date on the latest crawling tools and techniques.
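As a hedged illustration of the fetch, extract, and store loop described in the responsibilities above (not a description of OJC's actual stack), a minimal Python sketch using Requests, Beautiful Soup, and SQLite might look like this. The start URL and table schema are placeholders.

```python
# Illustrative sketch: fetch a page with requests, extract its title and links
# with Beautiful Soup, and store the result in SQLite. URL and schema are placeholders.
import sqlite3
import requests
from bs4 import BeautifulSoup

conn = sqlite3.connect("pages.db")
conn.execute("CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, title TEXT)")

def crawl(url: str) -> list[str]:
    """Fetch one page, store its title, and return the absolute links found on it."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    title = soup.title.string if soup.title else ""
    conn.execute("INSERT OR REPLACE INTO pages VALUES (?, ?)", (url, title))
    conn.commit()
    return [a["href"] for a in soup.find_all("a", href=True) if a["href"].startswith("http")]

if __name__ == "__main__":
    for link in crawl("https://example.com")[:5]:  # placeholder start URL
        print(link)
```

A production crawler would add politeness (robots.txt, rate limiting), deduplication, and retries, which frameworks such as Scrapy provide out of the box.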
Required Skills and Qualifications:
- Bachelor's degree in computer science or a related field.
- 2-3 years of experience with Python and web crawling.
- Familiarity with tools/modules such as Scrapy, Selenium, Requests, Beautiful Soup, etc.
- API tools such as Postman or equivalent.
- Working knowledge of SQL.
- Experience with web crawling and data extraction.
- Strong problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Excellent communication and documentation skills.
What we Offer
• Competitive salary
• Medical Benefits/Accident Cover
• Flexi Office Working Hours
• Fast-paced start-up environment
MS D365 Solution Architect
Must have at least one Microsoft certification relating to D365
Minimum 3-5 Years’ experience
Job Description
As a Part of a Collaborative and Entrepreneurial Team, You Will:
- Act as the primary Dynamics and Power Platform subject matter expert on client engagements.
- Interact with clients to understand business requirements.
- Demonstrate MS CRM solution capabilities to the client.
- Conduct business process analysis and create a Fit/Gap report.
- Create solution designs to address client business, interface, and performance requirements.
- Advise on complex MS Dynamics CRM business cases and propose comprehensive solutions based on MS CRM, third-party products, and customizations.
- Create functional requirements and functional designs for customizations.
- Create estimates for implementation tasks.
- Learn our client's business, their organization, systems, challenges, and goals.
- Craft the technical vision by mapping requirements to technical capabilities.
- Build prototypes and proofs of concept (POCs) to validate technical and solution decisions.
- Lead client-facing and internal training (technology training and/or application-specific training).
- Contribute to continuously improving IP and demoware by creating reusable templates, sharing knowledge, and building reusable demos.
- Be an innovator who can create new solutions using out-of-the-box thinking.
- Assist sales and presales teams to prepare proposals, participate in client presentations, and support business development and the sale of professional services when necessary.
- Estimate and design staffing for proposed solutions.
Your Technical & Non-Technical Experience Includes:
- 3-5 years of overall MS Dynamics 365 experience.
- Experience as an MS Dynamics CRM solution architect in a client-facing role for an MS Dynamics top-tier or similar consulting organization is highly desirable.
- Hands-on experience in designing, configuring, or administering MS Dynamics CRM is required.
- Proven experience and knowledge of Power Automate.
- Proven experience in customising and configuring Microsoft Dynamics 365 CRM.
- Proven experience in implementing C# plugins and JavaScript.
- Proven experience in creating Microsoft flows.
- Hands-on experience using the XrmToolBox.
- Proven experience in integrating websites or landing pages with Dynamics 365.
- Expert level in D365 Field Service and Sales strongly desired.
- Knowledge of Remote Assist/Guides or other similar Mixed Reality applications is beneficial.
- Understanding of IoT/Connected Field Service.
- Experience implementing integrated Dynamics 365 solutions:
  o Document Core Pack
  o Dynamics Portals
- Knowledge of how and when to use plugins, workflows, and JavaScript assemblies.
- Modern Workplace (SharePoint/Teams) experience would be ideal.
- Proven track record operating as a player, accountable for driving specific deals.
- Superior communication and personal leadership skills in a high-growth environment.
Must haves:
1. Should have Dynamics 365 certifications; look for profiles with as many certifications as possible.
2. Knowledge of Dynamics 365 Sales, Marketing, Field Service, and Portals.
3. Knowledge of customisation and configuration of CRM.
4. Able to design, develop, and deploy CRM solutions.
5. Knowledge of plugins, JavaScript development, and CRM integrations.
6. Knowledge of Power Platform and Power Automate.
7. Should proactively provide updates on new releases.
8. Should have client-facing experience.
9. Create solution designs to address client business, interface, and performance requirements.

Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses.
Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen, streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.
As Conviva is expanding, we are building products providing deep insights into end user experience for our customers.
Platform and TLB Team
The vision for the TLB team is to build data processing software that works on terabytes of streaming data in real time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets – both the Spark-like backend infra and the library-based programming model. Build a horizontally and vertically scalable system that analyses trillions of events per day within sub-second latencies. Utilize the latest and greatest big data technologies to build solutions for use cases across multiple verticals. Lead technology innovation and advancement that will have a big business impact for years to come. Be part of a worldwide team building software using the latest technologies and the best of software development tools and processes.
What You’ll Do
This is an individual contributor position. Expectations will be on the below lines:
- Design, build and maintain the stream processing, and time-series analysis system which is at the heart of Conviva's products
- Responsible for the architecture of the Conviva platform
- Build features, enhancements, new services, and bug fixing in Scala and Java on a Jenkins-based pipeline to be deployed as Docker containers on Kubernetes
- Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements etc.
- Lead a team to develop a feature or parts of the product
- Adhere to the Agile model of software development to plan, estimate, and ship per business priority
What you need to succeed
- 9+ years of work experience in software development of data processing products.
- Engineering degree in software or equivalent from a premier institute.
- Excellent knowledge of fundamentals of Computer Science like algorithms and data structures. Hands-on with functional programming and know-how of its concepts
- Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
- Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
- Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
- Experience/knowledge of Akka/Lagom framework and/or stream processing technologies like RxJava or Project Reactor will be a big plus. Knowledge of design patterns like event-streaming, CQRS and DDD to build large microservice architectures will be a big plus
- Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity
Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its-kind innovations like time-state analytics and AI-automated data modeling, that surface actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their businesses ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision, and Warner Bros Discovery.
Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets!

The ideal candidate is a self-motivated multi-tasker and a demonstrated team player. You will be a lead developer responsible for the development of new software products and enhancements to existing products. You should excel at working with large-scale applications and frameworks and have outstanding communication and leadership skills.
Responsibilities
- Writing clean, high-quality, high-performance, maintainable code
- Develop and support software including applications, database integration, interfaces, and new functionality enhancements
- Coordinate cross-functionally to ensure the project meets business objectives and compliance standards
- Support test and deployment of new products and features
- Participate in code reviews
Qualifications
- 5+ years of relevant work experience
- Mandatory experience in building scalable microservices on the Node.js platform
- Expertise in Object Oriented Design, Database Design, Service architecture
- Experience with Agile or Scrum software development methodologies
- Ability to multi-task, organize, and prioritize work

