Description
Greetings from Ivan Infotech Pvt. Ltd.!
Hiring is ongoing for the Web Consultant profile in our website process.
Responsibilities
Communicate with clients to determine the scope of website development projects, implement SEO strategies to increase traffic to websites, and maintain websites.
- Manage customer queries and cold calling for a B2B website process in the international market.
- Be responsible for new business development (acquiring new business) and maintaining relationships with new and existing clients.
- Deal in the web/mobile app and digital marketing domains.
- Prepare progress updates and document website development processes.
Requirements
- Fluency in English is mandatory.
- Experience in any outbound sales process.
- Basic knowledge of computer operation.
- Learning mindset.
Opening for US Shift
Benefits:
Five-day work week; Saturday & Sunday fixed off
Free drop facility
Fixed salary with generous incentives
Growth Opportunity

About Ivan Infotech Pvt Ltd
WHO WE ARE
We are a team of digital practitioners with roots stretching back to the earliest days of online commerce, who dedicate themselves to serving our client companies.
We’ve seen the advancements first-hand over the last 25 years and believe our experiences allow us to innovate. Utilizing cutting-edge technology and providing bespoke, innovative services, we believe we can help you stay ahead of the curve.
We take a holistic view of digital strategy. Our approach to transformation is based on conscious Intent to delight customers through continuous Insight and creative Innovation with an enduring culture of problem-solving.
We bring every element together to create innovative, high-performing commerce experiences for enterprise B2C, B2B, D2C and Marketplace brands across industries. From mapping out business and functional requirements, to developing the infrastructure to optimize traditionally fragmented processes, we help you create integrated, future-proofed commerce solutions.
WHAT YOU’LL BE DOING
As part of our team, you'll play a key role in building and evolving our Integration Platform as a Service (iPaaS) solution. This platform empowers our clients to seamlessly connect systems, automate workflows, and scale integrations with modern cloud-native tools.
Here’s what your day-to-day will look like:
- Designing and Building Integrations
- Collaborate with clients to understand integration needs and build scalable, event-driven solutions using Apache Kafka, AWS Lambda, API Gateway, and EventBridge.
- Cloud-Native Development
- Develop and deploy microservices and serverless functions using TypeScript (Node.js), hosted on Kubernetes (EKS) and fully integrated with core AWS services like S3, SQS, and SNS.
- Managing Data Pipelines
- Build robust data flows and streaming pipelines using Kafka and NoSQL databases like MongoDB, ensuring high availability and fault tolerance.
- Client Collaboration
- Work directly with customers to gather requirements, design integration patterns, and provide guidance on best practices for cloud-native architectures.
- Driving Platform Evolution
- Contribute to the ongoing improvement of our iPaaS platform—enhancing observability, scaling capabilities, and CI/CD processes using modern DevOps practices.
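The event-driven integration work described above can be sketched as a serverless handler. This is an illustration only, written in Python for brevity (the platform itself uses TypeScript per the description); the event shape mirrors an SQS-style batch delivery, and all field names are assumptions:

```python
# Hypothetical sketch of one integration step as a serverless handler.
# The event shape mirrors an SQS batch delivery; field names are assumptions.
import json

def handler(event, context=None):
    """Normalize each incoming order record for a downstream system."""
    forwarded, failures = [], []
    for record in event.get("Records", []):
        try:
            payload = json.loads(record["body"])
            forwarded.append({
                "order_id": payload["id"],
                "status": payload.get("status", "unknown").upper(),
            })
        except (KeyError, json.JSONDecodeError):
            # Report the failed message so the queue can redeliver it.
            failures.append({"itemIdentifier": record.get("messageId")})
    return {"batchItemFailures": failures, "forwarded": forwarded}
```

Returning the failed message IDs (rather than raising) lets the queue redeliver only the bad records, which is the usual pattern for partial-batch processing.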
WHAT WE NEED IN YOU
- Solid experience with Apache Kafka for data streaming and event-driven systems
- Production experience with Kubernetes (EKS) and containerized deployments
- Deep knowledge of AWS, including S3, EC2, SQS, SNS, EventBridge, Lambda
- Proficient in TypeScript (Node.js environment)
- Experience with MongoDB or other NoSQL databases
- Familiarity with microservices architecture, async messaging, and DevOps practices
- AWS Certification (e.g., Solutions Architect or Developer Associate) is a plus
Qualification
- Graduate: BE/B.Tech or equivalent.
- 5 to 8 years of experience
- Self-motivated quick learner with excellent problem-solving skills.
- A good team player with strong communication skills.
- Energy and real passion to work in a startup environment.
Visit our website - https://www.trikatechnologies.com
About Us
CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for the Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and enable better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.
Our Values
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.
Equal Opportunity Statement
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/.
Job Summary:
We are seeking a highly innovative and skilled AI Engineer to join our AI CoE for the Data Integration Project. The ideal candidate will be responsible for designing, developing, and deploying intelligent assets and AI agents that automate and optimize various stages of the data ingestion and integration pipeline. This role requires expertise in machine learning, natural language processing (NLP), knowledge representation, and cloud platform services, with a strong focus on building scalable and accurate AI solutions.
Key Responsibilities:
- LLM-based Auto-schematization: Develop and refine LLM-based models and techniques for automatically inferring schemas from diverse unstructured and semi-structured public datasets and mapping them to a standardized vocabulary.
- Entity Resolution & ID Generation AI: Design and implement AI models for highly accurate entity resolution, matching new entities with existing IDs and generating unique, standardized IDs for newly identified entities.
- Automated Data Profiling & Schema Detection: Develop AI/ML accelerators for automated data profiling, pattern detection, and schema detection to understand data structure and quality at scale.
- Anomaly Detection & Smart Imputation: Create AI-powered solutions for identifying outliers, inconsistencies, and corrupt records, and for intelligently filling missing values using machine learning algorithms.
- Multilingual Data Integration AI: Develop AI assets for accurately interpreting, translating (leveraging automated tools with human-in-the-loop validation), and semantically mapping data from diverse linguistic sources, preserving meaning and context.
- Validation Automation & Error Pattern Recognition: Build AI agents to run comprehensive data validation tool checks, identify common error types, suggest fixes, and automate common error corrections.
- Knowledge Graph RAG/RIG Integration: Integrate Retrieval Augmented Generation (RAG) and Retrieval Augmented Indexing (RIG) techniques to enhance querying capabilities and facilitate consistency checks within the Knowledge Graph.
- MLOps Implementation: Implement and maintain MLOps practices for the lifecycle management of AI models, including versioning, deployment, monitoring, and retraining on a relevant AI platform.
- Code Generation & Documentation Automation: Develop AI tools for generating reusable scripts, templates, and comprehensive import documentation to streamline development.
- Continuous Improvement Systems: Design and build learning systems, feedback loops, and error analytics mechanisms to continuously improve the accuracy and efficiency of AI-powered automation over time.
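As one illustration of the anomaly detection and smart imputation responsibilities above, here is a minimal, dependency-free sketch. The threshold and data are hypothetical; a production version would use scikit-learn or a comparable library:

```python
# Minimal sketch: flag outliers by z-score and impute missing values with the
# column mean. The threshold and values are illustrative assumptions.
from statistics import mean, stdev

def detect_outliers(values, z_threshold=3.0):
    """Return indices of values more than z_threshold std devs from the mean."""
    present = [v for v in values if v is not None]
    mu, sigma = mean(present), stdev(present)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values)
            if v is not None and abs(v - mu) / sigma > z_threshold]

def impute_missing(values):
    """Fill None entries with the mean of the observed values."""
    mu = mean(v for v in values if v is not None)
    return [mu if v is None else v for v in values]
```

In practice the outlier pass would run before imputation, so that corrupt records do not distort the fill values.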
Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related quantitative field.
- Proven experience (e.g., 3+ years) as an AI/ML Engineer, with a strong portfolio of deployed AI solutions.
- Strong expertise in Natural Language Processing (NLP), including experience with Large Language Models (LLMs) and their applications in data processing.
- Proficiency in Python and relevant AI/ML libraries (e.g., TensorFlow, PyTorch, scikit-learn).
- Hands-on experience with cloud AI/ML services.
- Understanding of knowledge representation, ontologies (e.g., Schema.org, RDF), and knowledge graphs.
- Experience with data quality, validation, and anomaly detection techniques.
- Familiarity with MLOps principles and practices for model deployment and lifecycle management.
- Strong problem-solving skills and an ability to translate complex data challenges into AI solutions.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- Experience with data integration projects, particularly with large-scale public datasets.
- Familiarity with knowledge graph initiatives.
- Experience with multilingual data processing and AI.
- Contributions to open-source AI/ML projects.
- Experience in an Agile development environment.
Benefits:
- Opportunity to work on a high-impact project at the forefront of AI and data integration.
- Contribute to solidifying a leading data initiative's role as a foundational source for grounding Large Models.
- Access to cutting-edge cloud AI technologies.
- Collaborative, innovative, and fast-paced work environment.
- Significant impact on data quality and operational efficiency.
What We’re Looking For:
- Strong experience in Python (3+ years).
- Hands-on experience with any database (SQL or NoSQL).
- Experience with frameworks like Flask, FastAPI, or Django.
- Knowledge of ORMs, API development, and unit testing.
- Familiarity with Git and Agile methodologies.
PlanetSpark is Hiring!!
Title of the Job: Data Analyst (Full Time)
Location: Gurgaon
Roles and Responsibilities/Mission Statement:
We are seeking an experienced Data Analyst to join our dynamic team. The ideal candidate will possess a strong analytical mindset, excellent problem-solving skills, and a passion for uncovering actionable insights from data. As a Data Analyst, you will be responsible for collecting, processing, and analyzing large datasets to help inform business decisions and strategies, and you will be the source of company-wide intelligence.
The responsibilities would include:
1) Creating a robust Sales MIS
2) Tracking key metrics of the company
3) Reporting key metrics on a daily basis
4) Sales incentive and teacher payout calculation
5) Tracking and analyzing large volumes of consumer data related to customers and teachers
6) Developing intelligence from data from various sources
Ideal Candidate Profile -
- 1-4 years of experience in a data-intensive position at a consumer business or a Big 4 firm
- Excellent ability in advanced Excel
- Knowledge of other data analytics tools such as SQL, Python, R, and data visualization software is good to have.
- Detail-oriented with strong organizational skills and the ability to manage multiple projects simultaneously.
- Exceptional analytical ability
Eligibility Criteria:
- Willing to work 5 days a week from the office, with Saturday as work from home
- Willing to work in an early-stage startup.
- Must have 1-3 years of prior experience in a data-focused role at a consumer internet company or a Big 4 firm
- Must have excellent analytical abilities
- Available to relocate to Gurgaon
- Candidate should have his/her own laptop
- Gurgaon-based candidates will be given preference
Join us and leverage your analytical expertise to drive data-driven decisions and contribute to our success. Apply today!
What will I be doing?
- Ideate with key stakeholders, do market research and translate business goals to the product roadmap
- Define the product release process and coordinate all the processes required to bring the product to market
- Interview and understand customer needs, problems, and experiences; document and report insights
- Coordinate with other teams/ stakeholders to deliver the product in a continuously evolving environment.
- Identify and track key success metrics for all the products, and generate strategies to improve usage
- Prioritize features against what matters most to achieve the strategic goals and initiatives.
- Define requirements of the features from desired business use cases and write software requirement specifications, including technical details (database design, architecture)
- Coordinate with developers and outsourcing agencies to develop the products, guide developers, and review code if required.
- Anticipate possible issues in the future when the product is deployed to scale and take preventative steps for high uptime.
- Prepare product impact reports and present them to funders at regular intervals.
What skills do I need?
- A Bachelor's degree in IT/Computer Engineering, or equivalent practical experience, is a must.
- Minimum 3 years of end-to-end software development experience and knowledge of the tech stack (Angular, PHP, NodeJS, Android, and React).
- Experience as a Product Manager/Owner in one of these business domains: Agriculture, Media & Advertisement, Healthcare, Payments.
- Interested and curious to work at the intersection of multiple functional areas, such as Product Management.
- Excellent communication (written and oral) and presentation skills.
- Manage the product development process from conception through design, build, release, experimentation, analysis, and iteration
- Communicate the business values to the field and customers
- Proven organizational skills; ability to multi-task and work on several assignments concurrently while remaining detail-oriented.
Our client operates a battery swapping network: instead of purchasing batteries, electric auto rickshaw drivers can swap batteries and pay as they use them. Their focus is on advantages such as a non-stop battery for 150 km, no waiting to charge for up to 100 km, and a pay-as-you-go option for up to 50 km.
Our client works as an App that focuses on providing comfort to people in terms of ride sharing, ride hailing and cashless payments. Their prominent features include digital payments, trained drivers, doorstep pickup, daily affordable pricing etc.
Headquartered in Delhi, our client was co-founded by three veteran entrepreneurs who are BITS Pilani alumni.
As a Fleet Sales Manager, you will be responsible for ensuring profitability of the company’s Fleet Sales division through the sale of swapping stations to the fleet operators and OEMs.
What you will do:
- Identifying target accounts and searching business contacts relevant to the company’s expansion opportunities
- Achieving set sales targets for the department
- Scheduling appointments with qualified leads, conducting the initial meeting, and working toward closure
- Actively managing the sales process: lead generation, credentials pitch, consultative questioning, solutions pitch, negotiations and closing
- Maintaining client relationships and key accounts for the company
- Developing and maintaining sales funnel
- Creating compelling profit objectives and maximizing cross-selling and upselling opportunities by staying up to date with the company's new product and service developments and continuously analyzing clients' business needs vis-a-vis the company's offerings
- Developing a strategic action plan for customer segmentation and competitor analysis to create value for customers
Candidate Profile:
What you need to have:
- Graduation in any field
- 3+ years in a sales role
- Experience in B2B sales and corporate sales
- B2C sales
- Outstanding organizational skills
- Analytical mindset and good problem-solving skills
- Quantitative ability
- Attention to detail
- Exceptional interpersonal skills
- Excellent written and verbal communication
- Strong time-management skills
- Negotiation and project management skills
Sr. Data Engineer:
Core Skills: Data Engineering, Big Data, PySpark, Spark SQL, and Python
Candidates with a prior Palantir Cloud Foundry or Clinical Trial Data Model background are preferred
Major accountabilities:
- Responsible for data engineering, Foundry data pipeline creation, Foundry analysis & reporting, Slate application development, reusable code development & management, and integrating internal or external systems with Foundry for high-quality data ingestion.
- Has a good understanding of the Foundry Platform landscape and its capabilities.
- Performs the data analysis required to troubleshoot data-related issues and assists in their resolution.
- Defines company data assets (data models) and the PySpark/Spark SQL jobs that populate them.
- Designs data integrations and the data quality framework.
- Designs and implements integration with internal and external systems and the F1 AWS platform using Foundry Data Connector or the Magritte agent.
- Collaborates with data scientists, data analysts, and technology teams to document and leverage their understanding of Foundry's integration with different data sources.
- Actively participates in agile work practices.
- Coordinates with the Quality Engineer to ensure that all quality controls, naming conventions, and best practices have been followed.
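The data quality framework mentioned in the accountabilities above can be illustrated with a minimal, dependency-free sketch. Rule names and fields here are hypothetical; a Foundry implementation would express these as pipeline checks:

```python
# Minimal sketch of a rule-based data quality framework: each rule is a
# named predicate applied per record, and violations are collected for
# reporting. Field names and example rules are illustrative assumptions.

def not_null(field):
    """Rule: the field must be present and non-null."""
    return (f"{field}_not_null", lambda rec: rec.get(field) is not None)

def in_range(field, lo, hi):
    """Rule: the field must be a value within [lo, hi]."""
    return (f"{field}_in_range",
            lambda rec: rec.get(field) is not None and lo <= rec[field] <= hi)

def run_checks(records, rules):
    """Return a list of (record_index, rule_name) for every violation."""
    violations = []
    for i, rec in enumerate(records):
        for name, predicate in rules:
            if not predicate(rec):
                violations.append((i, name))
    return violations

rules = [not_null("subject_id"), in_range("age", 0, 120)]
records = [
    {"subject_id": "S-001", "age": 42},
    {"subject_id": None, "age": 35},
    {"subject_id": "S-003", "age": 150},
]
```

Keeping rules as data (name plus predicate) makes the framework extensible: new checks are added without touching the runner, and the violation list feeds directly into reporting.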
Desired Candidate Profile :
- Strong data engineering background
- Experience with Clinical Data Model is preferred
- Experience in:
  - SQL Server, Postgres, Cassandra, Hadoop, and Spark for distributed data storage and parallel computing
  - Java and Groovy for back-end applications and data integration tools
  - Python for data processing and analysis
  - Cloud infrastructure based on AWS EC2 and S3
- 7+ years IT experience, 2+ years’ experience in Palantir Foundry Platform, 4+ years’ experience in Big Data platform
- 5+ years of Python and Pyspark development experience
- Strong troubleshooting and problem solving skills
- BTech or master's degree in computer science or a related technical field
- Experience designing, building, and maintaining big data pipeline systems
- Hands-on experience on Palantir Foundry Platform and Foundry custom Apps development
- Able to design and implement data integration between Palantir Foundry and external Apps based on Foundry data connector framework
- Hands-on with programming languages, primarily Python, R, Java, and Unix shell scripts
- Hands-on experience with the AWS / Azure cloud platforms and stack
- Strong in API-based architecture and concepts; able to do a quick PoC using API integration and development
- Knowledge of machine learning and AI
- Skill and comfort working in a rapidly changing environment with dynamic objectives and iteration with users.
- Demonstrated ability to continuously learn, work independently, and make decisions with minimal supervision
What you’ll be doing:
● Write content for the blog and large story pieces. Interview senior facility management executives and real-estate leaders.
● Product content: Create an educational product and solution content across platforms.
● Customer marketing: Become an expert on our customers and their unique cases. Write compelling case study content in multiple formats (webpage, blog, video, social) that highlights how Facilio helps enterprises transform facilities experience.
● Sales Enablement: Communicate value proposition and high-value sales materials, presentations, partner training, and feature collaterals.
● Optimize all content for SEO for better reach; measure and improve content programs
What type of skills you’ll need:
● Previous experience: 1-3 yrs of content writing or marketing experience, preferably at a B2B tech product company
● You love writing stories. Have many examples to show. Can work with consistent feedback.
● You love to research, to dig below the surface, and uncover the truth.
● Customers are always your main focus and priority.












