
Ideal Candidate
6+ years of experience selling B2B SaaS to US customers
Prior success as one of the first AEs at a startup; comfortable working in a fast-paced, early-stage environment
Track record of consistently beating your quota
Solid hustle and an entrepreneurial mindset; thrive under the uncertainty that comes with an early-stage setup
Strong project management skills and proactiveness, given the long sales cycles and pilot-heavy (free trial) nature of deals
Strong first-principles understanding of business fundamentals; ability to understand the product at a granular level; ability to empathise with end users and understand their pain points
Strong first-principles understanding of the different stages of the sales funnel, and experience maintaining good CRM hygiene
Strong first-principles understanding of sales processes, e.g. MEDDPICC
Willingness to get your hands dirty and do the grunt work in the early days.
MANDATORY - Willing to work in the US time zone (~4 am IST) on weekdays (Mon-Fri) because the majority of our clients are based in the US. You have flexibility on when you want to start your day in the afternoon (IST), or take breaks (e.g. family time, gym etc). This role will require you to stretch outside your comfort zone.

Similar jobs


We are seeking a Cloud Architect for a Geocode Service Center Modernization Assessment and Implementation project. The primary objective is to migrate the legacy Geocode Service Center to a cloud-based solution. Initial efforts will focus on leading the assessment and design phases, followed by implementation of the approved design.
Responsibilities:
- System Design and Architecture: Design and develop scalable, cloud-based geocoding systems that meet business requirements.
- Integration: Integrate geocoding services with existing cloud infrastructure and applications (see the sketch after this list).
- Performance Optimization: Optimize system performance, ensuring high availability, reliability, and efficiency.
- Security: Implement robust security measures to protect geospatial data and ensure compliance with industry standards.
- Collaboration: Work closely with data scientists, developers, and other stakeholders to understand requirements and deliver solutions.
- Innovation: Stay updated with the latest trends and technologies in cloud computing and geospatial analysis to drive innovation.
- Documentation: Create and maintain comprehensive documentation for system architecture, processes, and configurations.
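For flavor, a minimal sketch of the kind of geocoding integration described above, assuming AWS and its Amazon Location Service via boto3; the place index name ("geocode-index"), region, and sample address are illustrative placeholders, and tools such as Precisely or ESRI would expose analogous APIs:

```python
# Sketch: geocode an address with Amazon Location Service via boto3.
# Assumes a place index named "geocode-index" already exists in the account;
# the index name, region, and address are illustrative placeholders.
import boto3

client = boto3.client("location", region_name="us-east-1")

def geocode(address):
    """Return (longitude, latitude) of the best match, or None."""
    resp = client.search_place_index_for_text(
        IndexName="geocode-index",
        Text=address,
        MaxResults=1,
    )
    results = resp.get("Results", [])
    if not results:
        return None
    # Point is [longitude, latitude] in WGS 84.
    lon, lat = results[0]["Place"]["Geometry"]["Point"]
    return lon, lat

if __name__ == "__main__":
    print(geocode("1600 Pennsylvania Ave NW, Washington, DC"))
```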
Requirements:
- Educational Background: Bachelor’s or Master’s degree in Computer Science, Information Technology, Geography, or a related field.
- Technical Proficiency: Extensive experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and geocoding tools such as Precisely and ESRI.
- Programming Skills: Proficiency in programming languages such as Python, Java, or C#.
- Analytical Skills: Strong analytical and problem-solving skills to design efficient geocoding systems.
- Experience: Proven experience in designing and implementing cloud-based solutions, preferably with a focus on geospatial data.
- Communication Skills: Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Certifications: Relevant certifications in cloud computing (e.g., AWS Certified Solutions Architect) and geospatial technologies are a plus.
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through this link: https://zrec.in/il0hc?source=CareerSite
Explore our Career Page for more such jobs: careers.infraveo.com



- Previous working experience as a MySQL Developer for at least 3 years
- Identify opportunities for improved performance in SQL operations and implementations
- Oversee the operation of the production and staging environment databases
- Give design recommendations for database functions that meet business operating standards while improving the efficiency of business processes
- Train and mentor junior personnel on best practices
Candidate Profile:
- Bachelor’s/Master’s degree in Engineering or Computer Science
- 5+ years of relevant experience as a database programmer
- Excellent MySQL/PostgreSQL/MS-SQL development skills
- Experience writing views, stored procedures, triggers, etc. (see the sketch after this list)
- Excellent knowledge of important RDBMS features
- Strong problem-solving, technical troubleshooting, and diagnostic skills
- Solid knowledge of RDBMS and NoSQL technologies
- Experience in developing back-ends for enterprise systems
- Knowledge of debugging, performance and optimization techniques
- Experience with RDBMS technologies like MySQL, PostgreSQL, etc.
- Experience with NoSQL technologies like MongoDB, Cassandra, etc.
- Knowledge of caching databases like Redis, Memcached, etc.
- Knowledge of search databases like Solr, Elasticsearch, etc.
- Demonstrated ability to deliver in a fast-paced environment.
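To make the stored-routine expectation concrete, a minimal sketch that creates and calls a MySQL stored procedure from Python via mysql-connector-python; the connection details and the `orders` table are illustrative assumptions, not a real schema:

```python
# Sketch: create and call a MySQL stored procedure from Python.
# Uses mysql-connector-python; credentials and the `orders` table are
# illustrative placeholders.
import mysql.connector

cnx = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="shop"
)
cur = cnx.cursor()

# A stored procedure that counts orders for one customer.
cur.execute("DROP PROCEDURE IF EXISTS count_orders")
cur.execute(
    """
    CREATE PROCEDURE count_orders(IN cust_id INT, OUT total INT)
    BEGIN
        SELECT COUNT(*) INTO total FROM orders WHERE customer_id = cust_id;
    END
    """
)

# callproc returns the argument tuple with OUT parameters filled in.
result = cur.callproc("count_orders", (42, 0))
print("orders for customer 42:", result[1])

cur.close()
cnx.close()
```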


**We are hiring**
SURVEYSENSUM is looking for a Backend Engineer.
We're looking to expand our #gurugramTeam.
Requirement: 3 positions
Experience: 1-3 years
CTC: 5-14 LPA
Skills: C#/Java, ASP.NET, .NET Core/.NET, MySQL
● 1-3 Years of Software Development experience.
● Experience with IdentityServer4
● Experience with OAuth2 (see the sketch after this list)
● Experience developing apps using third-party APIs, for example Slack and Facebook
● Proven experience with ASP.NET Core Web API applications
● Proven experience with software design and OOD methodologies
● Familiarity with relational databases and SQL.
● BS degree in Computer Science/Engineering or equivalent
● Experience with web services development (REST)
● Strong in object-oriented programming, design patterns, and SOLID principles
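As a quick illustration of the OAuth2 requirement, a minimal Python sketch of the client-credentials flow against an IdentityServer4-style token endpoint (IdentityServer4's default token path is /connect/token); the URLs, client ID, secret, and scope are placeholder assumptions:

```python
# Sketch: OAuth2 client-credentials flow against an IdentityServer4-style
# token endpoint. URLs, client_id, client_secret, and scope are placeholders.
import requests

TOKEN_URL = "https://auth.example.com/connect/token"

def fetch_token():
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": "backend-service",
            "client_secret": "secret",
            "scope": "api1",
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

# Use the bearer token to call a protected ASP.NET Core Web API.
token = fetch_token()
resp = requests.get(
    "https://api.example.com/values",
    headers={"Authorization": f"Bearer {token}"},
)
print(resp.status_code)
```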
If you or someone in your network fits the bill, please reach out to Meenakshi Vaidwan on LinkedIn.

We are looking for an outstanding Big Data Engineer with experience setting up and maintaining data warehouses and data lakes for an organization. This role will closely collaborate with the Data Science team and help them build and deploy machine learning and deep learning models on big data analytics platforms.
Roles and Responsibilities:
- Develop and maintain scalable data pipelines and build out new integrations and processes required for optimal extraction, transformation, and loading of data from a wide variety of data sources using 'Big Data' technologies.
- Develop programs in Scala and Python as part of data cleaning and processing.
- Assemble large, complex data sets that meet functional/non-functional business requirements, fostering data-driven decision making across the organization.
- Design and develop distributed, high-volume, high-velocity, multi-threaded event processing systems.
- Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
- Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Maintain high operational excellence, guaranteeing high availability and platform stability.
- Closely collaborate with the Data Science team and help them build and deploy machine learning and deep learning models on big data analytics platforms.
Skills:
- Experience with Big Data pipeline, Big Data analytics, Data warehousing.
- Experience with SQL/NoSQL, schema design, and dimensional data modeling.
- Strong understanding of Hadoop architecture and the HDFS ecosystem, and experience with the Big Data technology stack such as HBase, Hadoop, Hive, MapReduce.
- Experience in designing systems that process structured as well as unstructured data at large scale.
- Experience in AWS/Spark/Java/Scala/Python development.
- Strong skills in PySpark (Python & Spark): the ability to create, manage, and manipulate Spark DataFrames, and expertise in Spark query tuning and performance optimization (see the sketch after this list).
- Experience in developing efficient software code/frameworks for multiple use cases leveraging Python and big data technologies.
- Prior exposure to streaming data sources such as Kafka.
- Knowledge of shell scripting and Python scripting.
- High proficiency in database skills (e.g., complex SQL) for data preparation, cleaning, and wrangling/munging, with the ability to write advanced queries and create stored procedures.
- Experience with NoSQL databases such as Cassandra / MongoDB.
- Solid experience in all phases of Software Development Lifecycle - plan, design, develop, test, release, maintain and support, decommission.
- Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development).
- Experience building and deploying applications on on-premise and cloud-based infrastructure.
- A good understanding of the machine learning landscape and its concepts.
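To ground the PySpark expectation, a minimal sketch of DataFrame manipulation with two common performance touches (a broadcast join to avoid a shuffle, plus caching and plan inspection); the input paths and column names are illustrative placeholders:

```python
# Sketch: basic PySpark DataFrame work with two common performance touches.
# Input paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("events-etl").getOrCreate()

events = spark.read.parquet("s3://bucket/events/")  # large fact table
users = spark.read.parquet("s3://bucket/users/")    # small dimension table

# Broadcast the small table to avoid a shuffle-heavy join.
joined = events.join(broadcast(users), on="user_id", how="inner")

daily = (
    joined
    .withColumn("day", F.to_date("event_ts"))
    .groupBy("day", "country")
    .agg(F.count("*").alias("events"), F.countDistinct("user_id").alias("users"))
)

daily.cache()    # reused downstream, so keep it in memory
daily.explain()  # inspect the physical plan when tuning
daily.write.mode("overwrite").parquet("s3://bucket/daily_agg/")
```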
Qualifications and Experience:
Engineering and postgraduate candidates, preferably in Computer Science from premier institutions, with 3-5 years of proven work experience as a Big Data Engineer or in a similar role.
Certifications:
Good to have at least one of the certifications listed here:
AZ-900 - Azure Fundamentals
DP-200, DP-201, DP-203, AZ-204 - Data Engineering
AZ-400 - DevOps certification


Candidates shortlisted in the machine test will have to fill in the Job Application Form shared by us and go through a final interview round with the Software Department Head.
Responsibilities
- Manual Testing + Test Automation
- Manual testing (Functional), Database testing
- Exposure to Selenium, JavaScript, etc.
- Exposure to Agile and DevOps
- Excellent communication skills
- Review requirements, specifications and technical design documents to provide timely and meaningful feedback
- Create detailed, comprehensive and well-structured test plans and test cases
- Estimate, prioritize, plan and coordinate testing activities
- Design, develop, and execute automation scripts using open-source tools (see the sketch after this list)
- Identify, record, thoroughly document, and track bugs
- Perform thorough regression testing when bugs are resolved
- Develop and apply testing processes for new and existing products to meet client needs
- Liaise with internal teams (e.g. developers and product managers) to identify system requirements
- Monitor debugging process results
- Investigate the causes of non-conforming software and train users to implement solutions
- Track quality assurance metrics, like defect densities and open defect counts
- Stay up to date with new testing tools and test strategies
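As a small illustration of the automation-script responsibility, a minimal Selenium sketch in Python using an explicit wait; the target URL and element locators are placeholder assumptions:

```python
# Sketch: a minimal Selenium (Python) UI check.
# URL and locators are illustrative placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()  # assumes chromedriver is on PATH
try:
    driver.get("https://example.com/login")

    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()

    # Wait for the post-login page rather than sleeping blindly.
    heading = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.TAG_NAME, "h1"))
    )
    assert "Dashboard" in heading.text, "unexpected landing page"
finally:
    driver.quit()
```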
Requirements
- Proven work experience in software development
- Proven work experience in software quality assurance
- Strong knowledge of software QA methodologies, tools and processes
- Experience in writing clear, concise and comprehensive test plans and test cases
- Hands-on experience with both white box and black box testing
- Hands-on experience with automated testing tools
- Solid knowledge of SQL and scripting
- Experience working in an Agile/Scrum development process
- Experience with performance and/or security testing is a plus
- BS/MS degree in Computer Science, Engineering or a related subject
- Knowledge of API/service testing would be important

- 6+ years of relevant experience in DB2 LUW administration
- Good experience with:
  - Performance tuning and troubleshooting
  - High-availability solutions: HACMP, TSA, MSCS Cluster
  - Monitoring, backup, and recovery: IBM Data Server Manager, TSM, Commvault (see the sketch after this list)
  - Data replication: Q Replication and CDC
  - Implementing DB2 key features
- Desirable experience with Db2 pureScale
- Experience with tools such as ITM, Nagios, ServiceNow
- Experience with automation and scripting such as cron, PowerShell, shell scripting
- Experience configuring and using clustering, db2diag and notification logs, and snapshot and event monitors
- Experience using problem and change management tools
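As a tiny illustration of the scripted-monitoring side, a Python sketch that polls DB2 administrative views through the ibm_db driver; the connection string is a placeholder, and the view and column names (SYSIBMADM.ENV_INST_INFO, SYSIBMADM.SNAPDB) should be verified against the target DB2 version:

```python
# Sketch: poll DB2 LUW administrative views for a basic health check.
# Connection string is a placeholder; verify view/column names against
# the target DB2 version before relying on them.
import ibm_db

DSN = (
    "DATABASE=sample;HOSTNAME=localhost;PORT=50000;"
    "PROTOCOL=TCPIP;UID=db2inst1;PWD=secret;"
)

conn = ibm_db.connect(DSN, "", "")

# Instance-level info (name and service level).
stmt = ibm_db.exec_immediate(
    conn, "SELECT INST_NAME, SERVICE_LEVEL FROM SYSIBMADM.ENV_INST_INFO"
)
row = ibm_db.fetch_assoc(stmt)
print("instance:", row["INST_NAME"], "level:", row["SERVICE_LEVEL"])

# Database snapshot: status and current connections.
stmt = ibm_db.exec_immediate(
    conn, "SELECT DB_NAME, DB_STATUS, TOTAL_CONS FROM SYSIBMADM.SNAPDB"
)
row = ibm_db.fetch_assoc(stmt)
print("db:", row["DB_NAME"], "status:", row["DB_STATUS"], "conns:", row["TOTAL_CONS"])

ibm_db.close(conn)
```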

