
US IT Recruiter
We are aggressively hiring freshers with good communication skills for our US IT and US Sales domains at our Vadodara, Gujarat location.
Fresh graduates and postgraduates can apply.
Candidates still pursuing their degree are not eligible. Fixed Saturday and Sunday off. US shift.



Position Overview: We are looking for an experienced and highly skilled Data Architect to join our team and help design, implement, and optimize data systems that support high-end analytical solutions for our clients. As a Data Architect, you will work closely with clients to understand their business needs and translate them into robust, scalable, and efficient technical solutions. You will be responsible for end-to-end data modeling, integration workflows, and data transformation processes while ensuring security, privacy, and compliance.
In this role, you will also leverage the latest advancements in artificial intelligence, machine learning, and large language models (LLMs) to deliver high-impact solutions that drive business success. The ideal candidate will have a deep understanding of data infrastructure, optimization techniques, and cost-effective data management.
Key Responsibilities:
Customer Collaboration:
- Partner with clients to gather and understand their business requirements, translating them into actionable technical specifications.
- Act as the primary technical consultant to guide clients through data challenges and deliver tailored solutions that drive value.
Data Modeling & Integration:
- Design and implement scalable, efficient, and optimized data models to support business operations and analytical needs.
- Develop and maintain data integration workflows to seamlessly extract, transform, and load (ETL) data from various sources into data repositories (a minimal illustrative sketch follows below).
- Ensure smooth integration between multiple data sources and platforms, including cloud and on-premises systems.
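The posting does not prescribe a specific stack, so the following is only a minimal sketch of such an ETL workflow in Python, using pandas and SQLAlchemy; the connection strings, table names, and column names are hypothetical.

# Minimal ETL sketch (pandas + SQLAlchemy). All connection strings,
# table names, and column names below are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_DB = "postgresql://user:password@source-host/sales"      # hypothetical source
WAREHOUSE_DB = "postgresql://user:password@warehouse-host/dwh"  # hypothetical target

def extract(engine) -> pd.DataFrame:
    # Pull raw order records from an operational source system.
    return pd.read_sql(
        "SELECT order_id, customer_id, amount, created_at FROM orders", engine
    )

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Basic cleaning and conforming of the raw records to the warehouse model.
    df = df.dropna(subset=["order_id", "customer_id"])
    df["created_at"] = pd.to_datetime(df["created_at"])
    df["amount"] = df["amount"].astype(float)
    return df

def load(df: pd.DataFrame, engine) -> None:
    # Append the conformed records into a warehouse fact table.
    df.to_sql("fact_orders", engine, if_exists="append", index=False)

if __name__ == "__main__":
    source = create_engine(SOURCE_DB)
    warehouse = create_engine(WAREHOUSE_DB)
    load(transform(extract(source)), warehouse)

In practice such a job would typically run under an orchestrator (Airflow, cron, or a cloud-native scheduler) rather than as a standalone script.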
Data Processing & Optimization:
- Develop, optimize, and manage data processing pipelines to enable real-time and batch data processing at scale.
- Continuously evaluate and improve data processing performance, optimizing for throughput while minimizing infrastructure costs.
Data Governance & Security:
- Implement and enforce data governance policies and best practices, ensuring data security, privacy, and compliance with relevant industry regulations (e.g., GDPR, HIPAA).
- Collaborate with security teams to safeguard sensitive data and maintain privacy controls across data environments.
Cross-Functional Collaboration:
- Work closely with data engineers, data scientists, and business analysts to ensure that the data architecture aligns with organizational objectives and delivers actionable insights.
- Foster collaboration across teams to streamline data workflows and optimize solution delivery.
Leveraging Advanced Technologies:
- Utilize AI, machine learning models, and large language models (LLMs) to automate processes, accelerate delivery, and provide smart, data-driven solutions to business challenges.
- Identify opportunities to apply cutting-edge technologies to improve the efficiency, speed, and quality of data processing and analytics.
Cost Optimization:
- Proactively manage infrastructure and cloud resources to optimize throughput while minimizing operational costs.
- Make data-driven recommendations to reduce infrastructure overhead and increase efficiency without sacrificing performance.
Qualifications:
Experience:
- Proven experience (5+ years) as a Data Architect or similar role, designing and implementing data solutions at scale.
- Strong expertise in data modeling, data integration (ETL), and data transformation processes.
- Experience with cloud platforms (AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark).
Technical Skills:
- Advanced proficiency in SQL, data modeling tools (e.g., Erwin, PowerDesigner), and data integration frameworks (e.g., Apache NiFi, Talend).
- Strong understanding of data security protocols, privacy regulations, and compliance requirements.
- Experience with data storage solutions (e.g., data lakes, data warehouses, NoSQL, relational databases).
AI & Machine Learning Exposure:
- Familiarity with leveraging AI and machine learning technologies (e.g., TensorFlow, PyTorch, scikit-learn) to optimize data processing and analytical tasks.
- Ability to apply advanced algorithms and automation techniques to improve business processes.
Soft Skills:
- Excellent communication skills to collaborate with clients, stakeholders, and cross-functional teams.
- Strong problem-solving ability with a customer-centric approach to solution design.
- Ability to translate complex technical concepts into clear, understandable terms for non-technical audiences.
Education:
Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent practical experience).
LIFE AT FOUNTANE:
- Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
- Competitive pay
- Health insurance for spouses, kids, and parents.
- PF/ESI or equivalent
- Individual/team bonuses
- Employee stock ownership plan
- Fun/challenging variety of projects/industries
- Flexible workplace policy - remote/physical
- Flat organization - no micromanagement
- Individual contribution - set your deadlines
- Above all - culture that helps you grow exponentially!
A LITTLE BIT ABOUT THE COMPANY:
Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.
We’re a team of 120+ people from around the world who are radically open-minded and believe in excellence, respecting one another, and pushing our boundaries further than ever before.
1. 8 to 12 years of total IT experience, with full exposure to the end-to-end SDLC
2. 1 to 2 years of experience as a Solution Architect
3. Basic understanding of architecture methodologies such as TOGAF
4. Working knowledge of architecture tools such as Visio or any other EA tool
5. Experience delivering architecture artifacts and diagrams such as business architecture, application architecture, and deployment architecture
6. Experience designing and implementing solutions using architecture patterns such as microservices, event-driven architecture, and REST APIs
7. Ability to elicit both functional and non-functional requirements from the business
8. Able to provide solutions that address non-functional requirements such as scalability, performance, security, availability (HA/DR and related concepts), and resiliency (RTO/RPO and related concepts)
9. Understanding of common network protocols (HTTPS, TCP/IP, etc.) and authentication mechanisms such as LDAP
About Tibco
Headquartered in Palo Alto, CA, TIBCO Software enables businesses to reach new heights on their path to digital distinction and innovation. From systems to devices and people, we interconnect everything, capture data in real time wherever it is, and augment the intelligence of organizations through analytical insights. Thousands of customers around the globe rely on us to build compelling experiences, energize operations, and propel innovation. Our teams flourish on new ideas and welcome individuals who thrive in transforming challenges into opportunities. From designing and building amazing products to providing excellent service, we encourage and are shaped by bold thinkers, problem-solvers, and self-starters. We are always adapting and providing exciting opportunities for our employees to grow, learn, and excel.
We value the customers and employees that define who we are; dynamic individuals willing to take the risks necessary to make big ideas come to life and who are comfortable collaborating in our creative, optimistic environment. TIBCO – we are just scratching the surface.
Who You’ll Work With
TIBCO Data Virtualization (TDV) is an enterprise data virtualization solution that orchestrates access to multiple and varied data sources, delivering data sets and IT-curated data services to any analytics solution. TDV is a Java-based, enterprise-grade database engine supporting all phases of data virtualization development, run-time, and management. It is the trusted solution of choice for top enterprises in verticals like finance, energy, pharmaceutical, retail, telecom, etc. Are you interested in working on leading-edge technologies? Are you fascinated with Big Data, Cloud, Federation, and Data Pipelines? If you have built software frameworks and have a background in Data Technologies, Application Servers, Business Intelligence, etc., this opportunity is for you.
Overview
The TIBCO Data Virtualization team is looking for an engineer with experience in SQL data access using JDBC, web services, and native client access for both relational and non-relational sources. You will have expertise in developing a metadata layer around disparate data sources and implementing a query runtime engine for data access, including plugin management. The core responsibilities will include designing, implementing, and maintaining the subsystem that abstracts data and metadata access across different relational database flavors, Big Data sources, cloud applications, and enterprise application packages such as SAP R/3, SAP BW, and Salesforce. The server is implemented by a multi-million-line source base in Java, so the ability to understand and integrate with existing code is an absolute must. The core runtime is a complex multi-threaded system, and the successful candidate will demonstrate complete expertise in handling features geared towards concurrent transactions in a low-latency, high-throughput, and scalable server environment. The candidate will have the opportunity to work in a collaborative environment with leading database experts in building the most robust, scalable, and high-performing database server.
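TDV itself is a Java codebase, but purely as a conceptual sketch of the kind of plugin-managed data-source abstraction described above, here is a rough illustration in Python; every class, function, and registry name is hypothetical and not part of the actual product.

# Conceptual sketch only: a plugin registry plus a common adapter interface
# that a query engine could use to abstract access to different sources.
# All names here are hypothetical.
from abc import ABC, abstractmethod

class DataSourceAdapter(ABC):
    # Common interface the runtime uses regardless of the backing source.
    @abstractmethod
    def get_metadata(self, schema: str) -> dict: ...

    @abstractmethod
    def execute(self, query: str) -> list[dict]: ...

class PostgresAdapter(DataSourceAdapter):
    def get_metadata(self, schema: str) -> dict:
        # A real adapter would introspect the source's catalog here.
        return {"source": "postgres", "schema": schema, "tables": []}

    def execute(self, query: str) -> list[dict]:
        # A real adapter would delegate to a JDBC/ODBC/native driver here.
        return []

# Plugin registry keyed by source type; new source types register new adapters.
ADAPTERS: dict[str, type[DataSourceAdapter]] = {"postgres": PostgresAdapter}

def open_source(source_type: str) -> DataSourceAdapter:
    # The engine talks to every source only through the DataSourceAdapter interface.
    return ADAPTERS[source_type]()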
Job Responsibilities
In this crucial role as a Data Source Engineer, you will:
• Drive enhancements to existing data-source layer capabilities
• Understand and interface with 3rd party JDBC drivers
• Ensure all security-related aspects of driver operation function with zero defects
• Diagnose customer issues and perform bug fixes
• Suggest and implement performance optimizations
Required Skills
• Bachelor’s degree with 3+ years of experience, or equivalent work experience.
• 3+ years programming experience
• 2+ years of Java based server side experience
• 1+ years of experience with at least one of JDBC, ODBC, SOAP, REST, or OData
• 1+ years of multithreading experience
• Proficiency in both spoken and written communication in English is a must
Desired Skills
• Strong object-oriented design background
• Strong SQL & database background
• Experience developing or configuring cloud-based software
• Experience with all lifecycle aspects of enterprise software
• Experience working with large, pre-existing code bases
• Experience with enterprise security technologies
• Experience with any of the following types of data sources: Relational, Big Data, Cloud, Data Lakes, and Enterprise Applications.
• Experience using Hive, Hadoop, Impala, Cloudera, and other Big Data technologies


Job Description
Responsibilities:
- Collaborate with stakeholders to understand business objectives and requirements for AI/ML projects.
- Conduct research and stay up-to-date with the latest AI/ML algorithms, techniques, and frameworks.
- Design and develop machine learning models, algorithms, and data pipelines.
- Collect, preprocess, and clean large datasets to ensure data quality and reliability.
- Train, evaluate, and optimize machine learning models using appropriate evaluation metrics.
- Implement and deploy AI/ML models into production environments.
- Monitor model performance and propose enhancements or updates as needed.
- Collaborate with software engineers to integrate AI/ML capabilities into existing software systems.
- Perform data analysis and visualization to derive actionable insights.
- Stay informed about emerging trends and advancements in the field of AI/ML and apply them to improve existing solutions.
Strong experience in Apache PySpark is a must (a minimal sketch of a typical PySpark pipeline follows below).
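As a minimal, hedged illustration of a typical PySpark workflow covering the preprocessing, training, and evaluation steps listed above (the input path and the "label", "f1", "f2" column names are hypothetical):

# Minimal PySpark sketch: load data, assemble features, train and evaluate a model.
# The input path and column names ("label", "f1", "f2") are hypothetical.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.evaluation import BinaryClassificationEvaluator

spark = SparkSession.builder.appName("ml-pipeline-sketch").getOrCreate()

df = spark.read.csv("s3://bucket/training_data.csv", header=True, inferSchema=True)
df = df.dropna(subset=["label", "f1", "f2"])  # basic cleaning

train, test = df.randomSplit([0.8, 0.2], seed=42)

assembler = VectorAssembler(inputCols=["f1", "f2"], outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="label")
model = Pipeline(stages=[assembler, lr]).fit(train)

# Evaluate on the held-out split using area under the ROC curve.
auc = BinaryClassificationEvaluator(labelCol="label").evaluate(model.transform(test))
print(f"Test AUC: {auc:.3f}")

spark.stop()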
Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proven experience of 3-5 years as an AI/ML Engineer or a similar role.
- Strong knowledge of machine learning algorithms, deep learning frameworks, and data science concepts.
- Proficiency in programming languages such as Python, Java, or C++.
- Experience with popular AI/ML libraries and frameworks, such as TensorFlow, Keras, PyTorch, or scikit-learn.
- Familiarity with cloud platforms, such as AWS, Azure, or GCP, and their AI/ML services.
- Solid understanding of data preprocessing, feature engineering, and model evaluation techniques.
- Experience in deploying and scaling machine learning models in production environments.
- Strong problem-solving skills and ability to work on multiple projects simultaneously.
- Excellent communication and teamwork skills.
Preferred Skills:
- Experience with natural language processing (NLP) techniques and tools.
- Familiarity with big data technologies, such as Hadoop, Spark, or Hive.
- Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes.
- Understanding of DevOps practices for AI/ML model deployment
- Apache Spark (PySpark)
- Understand the complex needs of Japanese users
- Design and develop prototypes for the solution
- Design in Figma and brainstorm on the issues
- Convert Figma designs into HTML/CSS or React code
- Code HTML/CSS
- Code in React JS
- Design all parts of the UI, including assets
Job Description
- Bachelor’s degree in software engineering, computer science or similar program.
- Need 2+ years of experience in Salesforce; candidates must be technically strong and capable of providing solutions.
- Minimum 2 years of SFDC CPQ implementation experience.
- Strong written and spoken English language communication skills
- Deep knowledge of Salesforce configuration, including objects, fields, profiles, roles, workflows, approval processes, Process Builder, etc.
- Strong knowledge of Apex, Visualforce, and the Lightning platform development framework
- Solid understanding of data migration tools and processes with regard to Salesforce
- Solid experience and knowledge developing integrations with Salesforce using REST or SOAP, for example
- Ability to clearly answer questions on the different design options available on the Salesforce platform and the trade-offs that exist between different approaches
- Lightning Components, REST API, Apex best practices; strong enthusiasm to learn new domains and technologies
- Should be able to lead the team and guide junior developers during implementations
Technical Skills
- Facilitate business process reviews to identify client requirements and processes
- Convert client requirements into Salesforce CPQ designs, leveraging best practices and minimizing the need for custom development
- Configure Salesforce CPQ and Sales Cloud solutions
• At least one full end-to-end CPQ project implementation
• Extensive experience integrating third-party apps with Salesforce CPQ
• Extensive experience configuring QCP (Quote Calculator Plugin), Custom Actions, Custom Scripts, Advanced Grouping, Price Rules, Product Rules, Summary Variables, custom CPQ advanced quote templates, Product Bundles, Configuration Attributes, types of Discount Schedules, and Pricing Models
• CPQ Certification is an added advantage.

- Your primary focus will be the development of Android applications and their integration with back-end services.
- You will be working alongside other engineers and developers working on different layers of the infrastructure. Therefore, commitment to collaborative problem solving, sophisticated design, and creating quality products is essential.
Skills :
- Strong knowledge of Android SDK, different versions of Android, and how to deal with different screen sizes
- Familiarity with RESTful APIs to connect Android applications to back-end services
- Strong knowledge of Android UI design principles, patterns, and best practices
- Experience with offline storage, threading, and performance tuning
- Ability to design applications around natural user interfaces, such as touch
- Familiarity with the use of additional sensors, such as gyroscopes and accelerometers
- Knowledge of the open-source Android ecosystem and the libraries available for common tasks
- Ability to understand business requirements and translate them into technical requirements
- Familiarity with cloud message APIs and push notifications
- A knack for benchmarking and optimization
- Understanding of Google's Android design principles and interface guidelines
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration
- Kotlin, Java
- Important libraries include Dagger, RxJava, and Realm; MVP / Clean Architecture
- Understanding of data structures and algorithms
- Basic knowledge of Kotlin
- Skills: Android, WebSockets, MVP development, RxJava, Realm Mobile Database

- Ensuring timely development of bench-top prototypes, stability studies, sensory evaluation, and scale-up for commercialisation.
- Identifying new market opportunities and developing new product concepts to address customer and market demand
- Developing & managing time-lines, budgets, and product development activities
- Identifying & reviewing new ingredients, products and technologies to determine scientific viability and applications
- Identifying source and partners of raw material ingredients. Actively searching products, vendors which will support the development and pipeline
- Collaborating with internal and external stakeholders, such as Procurement, Quality Assurance, Nutrition, Regulatory, Business development & Manufacturing
- Managing multiple complex projects simultaneously
- Reviewing Documents for:
- Stability Study Protocol/ Reports
- Process Validation Protocol & Reports
- Observation report after three successful commercial batches
- Affiliation to various government standards for similar product ranges
- Discussing and helping resolve formulation or process challenges during daily product development meetings, and monitoring the product development cycle
- Identifying achievements, failures, and bottlenecks on a weekly basis for review
- Reviewing and helping prepare the team's monthly performance report based on committed timelines, and submitting the same
- Being responsible for environment, health, and safety in the organisation as per company policy/regulations
What you need to have:
- 12+ years of experience in the Nutra and Ayurvedic products industry. Must be from a food technology background
- Exposure to ERP/ SAP
- Project management & data interpretation skills
- Ability to make & deliver process training sessions
- Knowledge of domestic and international markets
- Proficient with the Internet, MS Outlook, and MS Office, including Word, Excel, and PowerPoint
- High technical/scientific acumen and familiarity with Ayurvedic and nutraceutical ingredient raw materials
- Knowledge of food science and process technology, and a good understanding of nutrition and food regulations
- Strong communication, influencing, problem-solving, and analytical skills
- Analytical thought process and excellent organisational skills

Exp: 2-8 Years
Immediate joiners or candidates serving notice period only
Qualification: B.Tech or BCA+MCA only
Location – Gurgaon
Requirement - 6
Must have: Angular 2/4/6/8/10 or a later version, C#, and Web API
- Programming experience using C# and TypeScript
- Programming experience using Angular 2 and later versions
- Programming experience using REST API and Web API
- Experience using relevant tools: Microsoft Visual Studio and Visual Studio Code
- Extensive knowledge of agile methodologies.


