
Salesforce Application Tester (Service Cloud, Amazon Connect, Oracle)
We are seeking a Salesforce Application Tester to test applications developed on the Salesforce and Amazon Connect platforms and become an integral part of our team! You will develop software test scenarios, test cases, and test scripts, and execute them to identify software issues.
Responsibilities:
- Develop software test scenarios, test cases, and test scripts with detailed steps, and execute them.
- Test applications developed in Salesforce Service Cloud and Amazon Connect.
- Devise and implement test strategies that adequately assess all software aspects.
- Investigate and recreate reported defects.
- Work with other engineers to troubleshoot and resolve coding issues.
- Track and document all testing defects and resolutions.
- Participate in the design and development for test automation and support.
Requirements:
- Previous experience in software development, quality assurance, or other related fields.
- Candidate must possess a bachelor’s degree or equivalent university degree.
- Familiarity with relational databases such as MySQL, Oracle, and SQL Server.
- Strong root-cause analysis skills.
- Deadline and detail-oriented.
Benefits:
- Work Location: Remote
- 5-day work week
You can apply directly through the link: https://zrec.in/o0AQg?source=CareerSite
Explore our Career Page for more such jobs: careers.infraveo.com

Similar jobs
Key Responsibilities
- Manage daily accounting operations in Tally ERP
- Handle Accounts Payable: invoice verification, vendor payments & reconciliation
- Prepare and file GST returns, maintain GST records & ensure compliance
- Calculate, deduct & file TDS returns
- Maintain ledgers, books of accounts & financial records
- Coordinate with vendors for billing, statements, and payment follow-ups
- Assist in monthly, quarterly & yearly closing activities
- Support internal & external audits with proper documentation
- Prepare MIS and financial reports as needed
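As a toy illustration of the GST and TDS arithmetic behind the duties above, the sketch below splits GST on an intra-state invoice into CGST/SGST halves and computes a TDS deduction. The 18% GST and 10% TDS rates are illustrative assumptions only; actual rates depend on the goods/services supplied and the applicable TDS section.

```python
def gst_breakup(base_amount, gst_rate=0.18, intra_state=True):
    """Split GST on an invoice: CGST + SGST halves for intra-state supply,
    a single IGST for inter-state supply."""
    gst = base_amount * gst_rate
    if intra_state:
        return {"CGST": gst / 2, "SGST": gst / 2, "total": base_amount + gst}
    return {"IGST": gst, "total": base_amount + gst}

def tds_deduction(invoice_amount, tds_rate=0.10):
    """Compute TDS withheld at source and the net amount payable to the vendor."""
    tds = invoice_amount * tds_rate
    return {"tds": tds, "net_payable": invoice_amount - tds}

if __name__ == "__main__":
    print(gst_breakup(10000.0))    # CGST 900.0, SGST 900.0, total 11800.0
    print(tds_deduction(10000.0))  # tds 1000.0, net_payable 9000.0
```

In practice these figures come from Tally ERP and the GST/TDS return filings named above; the functions only mirror the arithmetic.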
🎯 Requirements
- 1–3 years of experience in accounting
- Strong knowledge of Tally ERP, GST, TDS, Accounts Payable
- Good understanding of accounting principles and compliance
- Proficiency in MS Excel
- Detail-oriented with good communication & organizational skills
Procedure is hiring for Drover.
This is not a DevOps/SRE/cloud-migration role — this is a hands-on backend engineering and architecture role where you build the platform powering our hardware at scale.
About Drover
Ranching is getting harder. Increased labor costs and a volatile climate are placing mounting pressure on ranchers who must provide for a growing population. Drover is empowering ranchers to efficiently and sustainably feed the world by making it cheaper and easier to manage livestock, unlock productivity gains, and reduce carbon footprint with rotational grazing. Not only is this a $46B opportunity; you'll also be working on a climate solution with the potential for real, meaningful impact.
We use patent-pending low-voltage electrical muscle stimulation (EMS) to steer and contain cows, replacing the need for physical fences or electric shock. We are building something that has never been done before, and we have hundreds of ranches on our waitlist.
Drover is founded by Callum Taylor (ex-Harvard), who comes from 5 generations of ranching, and Samuel Aubin, both of whom grew up in Australian ranching towns and have an intricate understanding of the problem space. We are well-funded and supported by Workshop Ventures, a VC firm with experience in building unicorn IoT companies.
We're looking to assemble a team of exceptional talent with a high eagerness to dive headfirst into understanding the challenges and opportunities within ranching.
About The Role
As our founding cloud engineer, you will be responsible for building and scaling the infrastructure that powers our IoT platform, connecting thousands of devices across ranches nationwide.
Because we are an early-stage startup, you will have high levels of ownership in what you build. You will play a pivotal part in architecting our cloud infrastructure, building robust APIs, and ensuring our systems can scale reliably. We are looking for someone who is excited about solving complex technical challenges at the intersection of IoT, agriculture, and cloud computing.
What You'll Do
- Develop Drover's IoT cloud architecture from the ground up (it's a greenfield project)
- Design and implement services to support wearable devices, mobile app, and backend API
- Implement data processing and storage pipelines
- Create and maintain Infrastructure-as-Code
- Support the engineering team across all aspects of early-stage development -- after all, this is a startup
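To make the services described above concrete, here is a minimal sketch of an event-driven, Lambda-style handler that ingests a telemetry message from a wearable device and normalizes it for downstream storage. All field names, the device schema, and the battery threshold are hypothetical illustrations, not Drover's actual design.

```python
import json

LOW_BATTERY_THRESHOLD = 20  # percent; an assumed alerting threshold

def handle_telemetry(event, context=None):
    """Parse an IoT telemetry event and flag devices needing attention.

    Accepts either an API Gateway-style event (JSON string under "body")
    or a bare payload dict.
    """
    body = event.get("body")
    payload = json.loads(body) if isinstance(body, str) else event
    record = {
        "device_id": payload["device_id"],
        "battery_pct": payload["battery_pct"],
        "lat": payload["location"]["lat"],
        "lon": payload["location"]["lon"],
        "low_battery": payload["battery_pct"] < LOW_BATTERY_THRESHOLD,
    }
    # In a real pipeline this record would be written to a queue or datastore.
    return {"statusCode": 200, "body": json.dumps(record)}

if __name__ == "__main__":
    event = {"device_id": "collar-042", "battery_pct": 17,
             "location": {"lat": -27.5, "lon": 152.9}}
    print(handle_telemetry(event)["body"])
```

The same handler shape works in TypeScript; the point is the stateless, per-event processing that serverless architectures are built around.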
Requirements
- 5+ years of experience developing cloud architecture on AWS
- In-depth understanding of various AWS services, especially those related to IoT
- Expertise in cloud-hosted, event-driven, serverless architectures
- Expertise in programming languages suitable for AWS micro-services (e.g., TypeScript, Python)
- Experience with networking and socket programming
- Experience with Kubernetes or similar orchestration platforms
- Experience with Infrastructure-as-Code tools (e.g., Terraform, AWS CDK)
- Familiarity with relational databases (PostgreSQL)
- Familiarity with Continuous Integration and Continuous Deployment (CI/CD)
Nice To Have
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, Electrical Engineering, or a related field
DataHavn IT Solutions is a company that specializes in big data and cloud computing, artificial intelligence and machine learning, application development, and consulting services. We aim to be a frontrunner in anything to do with data, and we have the expertise to transform customer businesses by making the right use of data.
About the Role:
As a Data Scientist specializing in Google Cloud, you will play a pivotal role in driving data-driven decision-making and innovation within our organization. You will leverage the power of Google Cloud's robust data analytics and machine learning tools to extract valuable insights from large datasets, develop predictive models, and optimize business processes.
Key Responsibilities:
- Data Ingestion and Preparation:
- Design and implement efficient data pipelines for ingesting, cleaning, and transforming data from various sources (e.g., databases, APIs, cloud storage) into Google Cloud Platform (GCP) data warehouses (BigQuery) or data lakes (Cloud Storage), using tools such as Dataflow.
- Perform data quality assessments, handle missing values, and address inconsistencies to ensure data integrity.
- Exploratory Data Analysis (EDA):
- Conduct in-depth EDA to uncover patterns, trends, and anomalies within the data.
- Utilize visualization techniques (e.g., Tableau, Looker) to communicate findings effectively.
- Feature Engineering:
- Create relevant features from raw data to enhance model performance and interpretability.
- Explore techniques like feature selection, normalization, and dimensionality reduction.
- Model Development and Training:
- Develop and train predictive models using machine learning algorithms (e.g., linear regression, logistic regression, decision trees, random forests, neural networks) on GCP platforms like Vertex AI.
- Evaluate model performance using appropriate metrics and iterate on the modeling process.
- Model Deployment and Monitoring:
- Deploy trained models into production environments using GCP's ML tools and infrastructure.
- Monitor model performance over time, identify drift, and retrain models as needed.
- Collaboration and Communication:
- Work closely with data engineers, analysts, and business stakeholders to understand their requirements and translate them into data-driven solutions.
- Communicate findings and insights in a clear and concise manner, using visualizations and storytelling techniques.
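The feature-engineering step listed above can be illustrated with a small, self-contained example: z-score normalization (one of the standardization techniques such a role uses) implemented with only the Python standard library. The input values are made up for demonstration.

```python
import statistics

def zscore_normalize(values):
    """Standardize a numeric feature column to zero mean and unit variance.

    Uses the population standard deviation; a constant column maps to zeros
    rather than dividing by zero.
    """
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return [0.0 for _ in values]
    return [(v - mean) / stdev for v in values]

if __name__ == "__main__":
    raw = [10.0, 20.0, 30.0, 40.0]
    print(zscore_normalize(raw))
```

On GCP the same transformation would typically run inside a Dataflow pipeline or a BigQuery SQL expression before model training in Vertex AI; the arithmetic is identical.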
Required Skills and Qualifications:
- Strong proficiency in Python or R programming languages.
- Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Cloud Dataproc, and Vertex AI.
- Familiarity with machine learning algorithms and techniques.
- Knowledge of data visualization tools (e.g., Tableau, Looker).
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.
Preferred Qualifications:
- Experience with cloud-native data technologies (e.g., Apache Spark, Kubernetes).
- Knowledge of distributed systems and scalable data architectures.
- Experience with natural language processing (NLP) or computer vision applications.
- Certifications in Google Cloud Platform or relevant machine learning frameworks.
Job Responsibilities:
Section 1 -
- Responsible for providing L1 support and for building, designing, deploying, and maintaining Cloud solutions on AWS.
- Implement, deploy, and maintain development, staging, and production environments on AWS.
- Familiarity with serverless architecture and AWS services such as Lambda, Fargate, EBS, Glue, etc.
- Understanding of Infrastructure as Code and familiarity with related tools such as Terraform, Ansible, CloudFormation, etc.
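As a rough illustration of the Infrastructure-as-Code idea referenced above, a minimal CloudFormation-style template can be built and serialized with nothing but the standard library. The logical resource name and bucket name here are hypothetical; real templates are usually authored in YAML or generated by tools like Terraform or the AWS CDK.

```python
import json

def make_bucket_template(bucket_name):
    """Return a minimal CloudFormation template defining one S3 bucket."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "AppBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {"BucketName": bucket_name},
            }
        },
    }

if __name__ == "__main__":
    # The JSON printed here could be deployed with
    # `aws cloudformation deploy` against a real account.
    print(json.dumps(make_bucket_template("example-app-logs"), indent=2))
```

Treating infrastructure as data like this is what makes environments reproducible across the development, staging, and production stages the role manages.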
Section 2 -
- Managing the Windows and Linux machines, Kubernetes, Git, etc.
- Responsible for L1 management of Servers, Networks, Containers, Storage, and Databases services on AWS.
Section 3 -
- Timely monitoring of production workload alerts and prompt resolution of issues.
- Responsible for monitoring and maintaining the Backup and DR process.
Section 4 -
- Responsible for documenting the process.
- Responsible for leading cloud implementation projects with end-to-end execution.
Qualifications: Bachelor of Engineering / MCA, preferably with AWS or other cloud certification
Skills & Competencies
- Linux and Windows servers management and troubleshooting.
- Experience with AWS services such as CloudFormation, EC2, RDS, VPC, EKS, ECS, Redshift, Glue, etc.
- Knowledge of Kubernetes and containers
- Understanding of setting up AWS messaging, streaming, and queuing services (MSK, Kinesis, SQS, SNS, MQ)
- Understanding of serverless architecture
- Strong understanding of networking concepts
- Experience managing monitoring and alerting systems
- Sound knowledge of database concepts such as data warehouses, data lakes, and ETL jobs
- Good project management skills
- Documentation skills
- Understanding of backup and DR
Soft Skills - Project management, Process Documentation
Ideal Candidate:
- AWS or other cloud certification, with 2-4 years of project execution experience.
- Someone who is interested in building sustainable cloud architecture with automation on AWS.
- Someone who is interested in learning and being challenged on a day-to-day basis.
- Someone who can take ownership of tasks and is willing to take the necessary action to get them done.
- Someone who is curious to analyze and solve complex problems.
- Someone who is honest about the quality of their work and is comfortable taking ownership of both their successes and failures.
Behavioral Traits
- We are looking for someone who is interested in being part of a creative, innovation-based environment with other team members.
- We are looking for someone who understands the importance of teamwork and individual ownership at the same time.
- We are looking for someone who can debate logically, respectfully disagree, admit if proven wrong, and learn from their mistakes and grow quickly.
Sourcing and Talent Acquisition:
- Utilize various channels (job boards, social media, networking, etc.) to source qualified candidates.
- Proactively identify and engage passive candidates.
- Build and maintain a strong talent pipeline.
- Employ recruitment marketing strategies to enhance employer branding.
Candidate Evaluation and Selection:
- Screen resumes and applications to assess candidate qualifications.
- Conduct phone, video, and in-person interviews.
- Evaluate candidates' skills, experience, and cultural fit.
- Perform reference checks and background checks.
- Manage the interview process and provide timely feedback to candidates.
Work Experience: 3 to 5 years
Official Notice Period – Immediate to 15 days ONLY
Excellent communication skills
Job Location – Mumbai
Mandatory Skills – Java 8, Spring boot, Hibernate, Microservices, Angular
They have partnered with 200+ MNCs for the training of their working professionals and for the placement of their alumni. They have partnered with 60+ universities and autonomous colleges to upskill students via university programs integrated with skill-based offerings.
What you will do:
- Smooth onboarding of fellows post recruitment
- Understanding the pre-existing technical and other skills as well as improvement needs in each area throughout training and deployment
- Weekly monitoring of progress as well as the key strengths and improvement areas of each fellow
- Planning remediation in consultation with the training head and trainers, with weekly/daily check-ins and query resolution to sustain a high level of motivation
- Monitoring and taking steps towards maintaining high performance standards post deployment
- Implementing upskilling learning and development initiatives based on individual fellow requirement and ensuring consistent appraisal and manager satisfaction
- Continuously monitoring and improving the Fellows’ performance during training and on-site, as well as analyzing client feedback, upskilling needs, retention and billability
Desired Candidate Profile
What you need to have:
- Minimum 10+ years' experience in managing technical talent in either talent management or technical manager roles
- Training experience desirable
- Awareness of job role skill and capability requirement for data engineer or similar technical job role
- Experience in performance management and employment relations
- Strong coaching and mentoring capabilities
- Highly data-driven and process-oriented mindset
- Strong interpersonal and communication skills
- Excellent relationship building skills
- Ability to influence and persuade and manage difficult conversations
- High levels of confidentiality
Preferred Education & Experience:
- Bachelor’s or master’s degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field, or equivalent practical experience. Relevant experience of at least 3 years in lieu of the above if from a different stream of education.
- Well-versed in and 5+ years of hands-on demonstrable experience with:
▪ Object Oriented Modeling, Design, & Programming
▪ Microservices Architecture, API Design, & Implementation
▪ Relational, Document, & Graph Data Modeling, Design, & Implementation
- Well-versed in and hands-on demonstrable experience with:
▪ Stream & Batch Big Data Pipeline Processing
▪ Distributed Cloud Native Computing
▪ Serverless Computing & Cloud Functions
- 5+ years of hands-on development experience in Java programming.
- 3+ years of hands-on development experience in one or more libraries & frameworks such as Spring Boot, Apache Camel, Akka, etc.; extra points if you can demonstrate your knowledge with working examples.
- 2+ years of hands-on development experience in one or more relational and NoSQL datastores such as Amazon S3, Amazon DocumentDB, Amazon Elasticsearch Service, Amazon Aurora, Amazon DynamoDB, Amazon Athena, etc.
- 2+ years of hands-on development experience in one or more technologies such as Amazon Simple Queue Service, Amazon Kinesis, Apache Kafka, AWS Lambda, AWS Batch, AWS Glue, AWS Step Functions, Amazon API Gateway, etc.
- 2+ years of hands-on development experience in one or more technologies such as AWS Developer Tools, AWS Management & Governance, AWS Networking and Content Delivery, AWS Security, Identity, and Compliance, etc.
- Well-versed in Virtualization & Containerization; must demonstrate experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Demonstrable working experience with API Management, API Gateway, Service Mesh, Identity & Access Management, and Data Protection & Encryption.
- Hands-on working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality & security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in storage, networking, and storage networking basics, which will enable you to work in a Cloud environment.
- Experience: 5+ years
- Job Location: Remote/Pune
Responsibilities:
• Mapping end-to-end business processes and identifying opportunities for automation
• Becoming certified in Django full-stack web technologies
• Designing, testing, and launching web solutions
Requirements:
• Undergraduate degree in a relevant discipline required
• Minimum of 4-6 years of relevant professional experience working on operational process improvement and/or transformational projects
• Minimum of 4-6 years’ hands-on technology or analytics experience in web application development
• Broad technology knowledge and experience, including specific development technologies such as Python, JavaScript, Django, and MySQL
• Knowledge of database technologies such as SQL Server, SSIS/SSRS, and analytics tools will be an advantage
• Must have: knowledge and hands-on experience as a Django full-stack web developer, expert in either frontend or backend, with hands-on experience in Python, JavaScript, Django, and MySQL
• Good to have: expertise in both frontend and backend as a Django full-stack web developer, with hands-on experience in Python, JavaScript, Django, and MySQL
• Highly developed project management and service delivery skills gained through directly relevant experience
• Experience working in the Financial Services industry preferred
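Django is the framework named above; as a framework-free sketch of the request/response cycle such a role works with every day, here is a minimal WSGI application using only the Python standard library. Django views sit on top of this same WSGI protocol. The routes and response bodies are invented for illustration.

```python
def app(environ, start_response):
    """A minimal WSGI application: routes on PATH_INFO, returns plain text.

    `environ` is the CGI-style request dict; `start_response` is the callable
    the server provides for setting the status line and headers.
    """
    path = environ.get("PATH_INFO", "/")
    if path == "/health":
        body = b"ok"
    else:
        body = b"hello"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

if __name__ == "__main__":
    # Exercise the app directly, without a server, by faking start_response.
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    print(b"".join(app({"PATH_INFO": "/health"}, start_response)).decode())
```

A Django view is conceptually the branch bodies above, with the framework handling routing, headers, and the ORM.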
Please apply through the link below:
https://forms.gle/tdBjPVZAjbyGF1u76
- Writing financial documents, news items, articles, and research reports.
- Keeping updated on various financial regulatory and global activities.
- Keeping watch on various global stock markets especially US Market.
- Writing various financial documents for clients, mainly fintech companies and agencies.
- Developing content for print, online, and presentation materials.
What you need to have:
- Minimum of 1 year of experience in financial report writing
- Exposure to the US financial/securities market
- Candidates from a financial background, or with experience in financial content writing (articles, blogs) at financial news or financial writing companies.
