
We are looking for a self-motivated and passionate individual with a strong desire to learn and the ability to lead. This position is for a Flight Test Engineer with exposure to building and flying sUAS (RC multirotors and fixed-wing aircraft). See the detailed job description below.
Responsibilities
• Plan and execute flight test plans for new software features, electronics, sensors, and payloads.
• Perform hands-on mechanical and electrical integration of new hardware components on the internal fleet of test vehicles for R&D and testing.
• Troubleshoot and debug any component of a drone, in the office or in the field. Maintain the vehicles and keep the fleet ready for flight tests.
• Participate in defining and validating customer workflows and enhancing the user experience.
• Coordinate cross-team efforts among FlytBase engineers to resolve issues identified during flight tests.
• Drive collaboration with FlytBase Developer team, Business Development team and Customer Support team to incorporate customer feedback and feature requests into FlytBase’s product development cycle.
• Learn about the domain and competitors to propose new drone applications, as well as improvements to existing applications.
Experience/Skills
• Experience in flight testing and operating/piloting small UAS and/or RC aircraft (both fixed-wing and multirotor systems).
• Experience in using flight-planning and ground control station software.
• Familiarity with UAV platforms such as Pixhawk, DJI, ArduPilot, and PX4.
• Experience in integrating, operating, and tuning autopilots on a variety of unmanned vehicles.
• Basic knowledge of electrical test equipment (multimeter, oscilloscope) and UAS sensors.
• Ability to work hands-on with electro-mechanical systems, including assembly, disassembly, testing, and troubleshooting.
•Good verbal and written communication skills.
Good to have
• RF communications fundamentals.
• Passion for aerial robots, i.e., drones.
• Programming and scripting languages for engineering use (C++, C, MATLAB, Python).
Compensation:
As per industry standards.
Perks:
+ Fast-paced Startup culture
+ Hacker mode environment
+ Great team
+ Flexible work hours
+ Informal dress code
+ Free snacks

About FlytBase:
FlytBase is building the world’s leading software platform for autonomous drone operations. Our technology enables enterprises to automate aerial data collection using docking stations—executing BVLOS flights with minimal human intervention.
With global recognitions like the NTT Data Innovation Award (Japan) and the TiE50 Award (Silicon Valley), FlytBase is at the forefront of drone-tech innovation—trusted by enterprises for scalability, security, and seamless integration into their workflows.

Hi,
We are seeking a senior data leader with deep functional expertise in Salesforce Sales and Service domains to own the enterprise data model, metrics, and analytical outcomes supporting Sales, Service, and Customer Operations.
This role is business‑first and data‑centric. The successful candidate understands how Salesforce Sales Cloud and Service Cloud data is generated, evolves over time, and is consumed by business teams, and ensures analytics accurately reflect operational reality.
Snowflake serves as the enterprise analytics platform, but Salesforce domain mastery and functional data expertise are the primary requirements for success in this role.
Core Responsibilities
Salesforce Sales & Service Data Ownership
· Act as the data owner and architect for Salesforce Sales and Service domains.
- Own Sales data including leads, accounts, opportunities, pipeline, bookings, revenue, forecasting, and CPQ (if applicable).
- Own Service data including cases, case lifecycle, SLAs, backlog, escalations, and service performance metrics.
- Define and govern enterprise‑wide KPI and metric definitions across Sales and Service.
- Ensure alignment between Salesforce operational definitions and analytics/reporting outputs.
- Own cross‑functional metrics spanning Sales, Service, and the customer lifecycle (e.g., customer health, renewals, churn).
Business‑Driven Data Modeling
· Design Salesforce‑centric analytical data models that accurately reflect Sales and Service processes.
- Model sales stage progression, pipeline history, and forecast changes over time.
- Model service case lifecycle, SLA compliance, backlog aging, and resolution metrics.
- Handle Salesforce‑specific complexities such as slowly changing dimensions (ownership, territory, account hierarchies).
- Ensure data models support operational dashboards, executive reporting, and advanced analytics.
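As one illustration of the slowly changing dimension handling mentioned above, account ownership changes are often tracked as type-2 style records with validity intervals, so that pipeline and bookings metrics can be attributed to whoever owned the account at the time. The sketch below is a minimal, hypothetical Python illustration; the field names and helper functions are illustrative, not actual Salesforce schema or any specific vendor's API.

```python
from datetime import date

def build_ownership_history(events):
    """Turn raw (account_id, owner, change_date) events into
    type-2 style records with [valid_from, valid_to) intervals.
    The latest record per account stays open (valid_to=None)."""
    history = []
    events = sorted(events, key=lambda e: (e[0], e[2]))
    for i, (acct, owner, start) in enumerate(events):
        # Close the interval at the next ownership change for the same account.
        end = None
        if i + 1 < len(events) and events[i + 1][0] == acct:
            end = events[i + 1][2]
        history.append({"account_id": acct, "owner": owner,
                        "valid_from": start, "valid_to": end})
    return history

def owner_at(history, account_id, as_of):
    """Return the owner of an account on a given date, or None."""
    for row in history:
        if (row["account_id"] == account_id
                and row["valid_from"] <= as_of
                and (row["valid_to"] is None or as_of < row["valid_to"])):
            return row["owner"]
    return None
```

The same interval logic carries over to territory and account-hierarchy changes; in a warehouse it would typically live in SQL as a dimension table rather than in application code.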
Analytics Enablement & Business Partnership
· Partner closely with Sales Operations, Service Operations, Revenue Operations, Finance, and Analytics teams.
- Translate business questions into trusted, reusable analytical datasets.
- Identify data quality issues or Salesforce process gaps impacting reporting and drive remediation.
- Enable self‑service analytics through well‑documented, certified data products.
Technical Responsibilities (Enabling Focus)
· Architect and govern Salesforce data ingestion and modeling on Snowflake.
- Guide ELT/ETL strategies for Salesforce objects such as Opportunities, Accounts, Activities, Cases, and Entitlements.
- Ensure reconciliation and auditability between Salesforce, Finance, and analytics layers.
- Define data access, security, and governance aligned with Salesforce usage patterns.
- Partner with data engineering teams on scalability, performance, and cost efficiency.
Required Experience & Skills
Salesforce Sales & Service Domain Expertise (Must‑Have)
· Extensive hands‑on experience working with Salesforce Sales Cloud and Service Cloud data.
- Strong understanding of sales pipeline management, forecasting, and revenue reporting.
- Strong understanding of service case workflows, SLAs, backlog management, and service performance measurement.
- Experience working directly with Sales Operations and Service Operations teams.
- Ability to identify when Salesforce configuration or process issues cause reporting inconsistencies.
Data & Analytics Expertise
· 10+ years working with business‑critical analytical data.
- Proven experience defining KPIs, metrics, and semantic models for Sales and Service domains.
- Strong SQL and analytical skills to validate business logic and data outcomes.
- Experience supporting BI and analytics platforms such as Tableau, Power BI, or MicroStrategy.
Platform Experience
· Experience using Snowflake as an enterprise analytics platform.
- Understanding of modern ELT/ETL and cloud data architecture concepts.
- Familiarity with data governance, lineage, and access control best practices.
Leadership & Collaboration
· Acts as a bridge between business stakeholders and technical teams.
- Comfortable challenging requirements using business and data context.
- Mentors engineers and analysts on Salesforce data nuances and business meaning.
- Strong communicator able to explain complex Salesforce data behavior to non‑technical leaders.
Thanks,
Ampera Talent Team
· Identifies and defines the objectives of an assigned marketing research project; determines the best methods to meet those objectives.
· Creates research summaries in multiple formats, including spreadsheets, PowerPoint presentations, and written reports.
· Summarizes and analyzes data; makes recommendations based on research findings.
· Recruits participants and assists with, or completes, project surveys on their behalf.
· Prepares findings, updates databases with newfound information, and creates a summary of the analysis for the project manager.
· Assists other researchers with various tasks, including data entry, sample care and storage, and field research.
· Performs internet searches to gather relevant information and records any findings.
Location: Bangalore / Pune (Remote)
Employment type: Full time
Website: www.amazech.com
Qualifications:
B.E./B.Tech/M.E./M.Tech in Computer Science, Information Technology, Electrical or Electronics Engineering, with a good academic background.
Experience and Required Skill Sets:
· Minimum 5 years of hands-on experience with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob Storage, and Azure Storage Explorer
· Experience building data warehouse/analytical systems using Azure Synapse.
· Proficient in creating Azure Data Factory pipelines for ETL processing: copy activities, custom Azure development, etc.
· Knowledge of Azure Data Catalog, Event Grid, Service Bus, SQL, Purview, and Synapse
· Good technical knowledge of the Microsoft SQL Server BI suite (ETL, reporting, analytics, dashboards) using SSIS, SSAS, SSRS, and Power BI
· Experience designing and developing batch and real-time streaming data loads into data warehouse systems
Other Requirements:
· A Bachelor's or Master's degree (Engineering or computer related degree preferred)
· Strong understanding of Software Development Life Cycles including Agile/Scrum
Responsibilities:
· Ability to create complex, enterprise-transforming applications that meet and exceed client expectations.
· Responsible for the bottom line. Strong project management abilities, including the ability to keep the team on schedule.
· Strong client-facing skills, including email and call communication with clients in the US/UK.
Why LiftOff?
We at LiftOff specialize in product creation; our main forte lies in helping entrepreneurs realize their dreams. We have helped businesses and entrepreneurs launch more than 70 products.
Many on the team are serial entrepreneurs with a history of successful exits.
As a Data Engineer, you will work directly with our founders and alongside our engineers on a variety of software projects covering various languages, frameworks, and application architectures.
About the Role
If you’re driven by the passion to build something great from scratch, a desire to innovate, and a commitment to achieve excellence in your craft, LiftOff is a great place for you.
- Architect, design, and configure the data ingestion pipeline for data received from third-party vendors
- Configure data loading for ease and flexibility when adding new data sources, as well as when refreshing previously loaded data
- Design and implement a consumer graph that provides an efficient means to query the data by email, phone, and address information (using any one of the fields or a combination)
- Expose the consumer graph/search capability for consumption by our middleware APIs, whose results are shown in the portal
- Design/review the current client-specific data storage, which is kept as a copy of the consumer master data for easier retrieval/query in subsequent usage
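The consumer-graph lookup described above, querying by email, phone, or address alone or in combination, could be sketched roughly as follows. This is a minimal in-memory illustration under assumed field names; the `ConsumerGraph` class and `find` method are hypothetical, not part of any actual system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Consumer:
    consumer_id: str
    email: str
    phone: str
    address: str

class ConsumerGraph:
    """Minimal in-memory index: look up consumers by email, phone,
    or address, using any one field or a combination (hypothetical)."""

    def __init__(self):
        self._by_email = {}
        self._by_phone = {}
        self._by_address = {}

    def add(self, c):
        # Index the same record under each of its identifying fields.
        self._by_email.setdefault(c.email, set()).add(c)
        self._by_phone.setdefault(c.phone, set()).add(c)
        self._by_address.setdefault(c.address, set()).add(c)

    def find(self, email=None, phone=None, address=None):
        # Intersect the candidate sets for every field supplied,
        # so combined queries narrow the result.
        sets = []
        if email is not None:
            sets.append(self._by_email.get(email, set()))
        if phone is not None:
            sets.append(self._by_phone.get(phone, set()))
        if address is not None:
            sets.append(self._by_address.get(address, set()))
        if not sets:
            return set()
        result = sets[0]
        for s in sets[1:]:
            result = result & s
        return result
```

In production the same access pattern would more likely be served by a graph or search store behind the middleware APIs, but the field-indexed intersection captures the query semantics the role describes.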
Please note that this is a Consultant role.
Candidates who are comfortable with freelancing/part-time work can apply.
- Build pixel-perfect, buttery smooth UIs across both mobile platforms.
- Leverage native APIs for deep integrations with both platforms.
- Diagnose and fix bugs and performance bottlenecks for performance that feels native.
- Reach out to the open-source community to encourage and help implement mission-critical software fixes—React Native moves fast and often breaks things.
- Maintain code and write automated tests to ensure the product is of the highest quality.
- Transition existing React web apps to React Native.
DevOps Engineer responsibilities include deploying product updates, identifying production issues, and implementing integrations that meet customer needs. If you have a solid background working with cloud technologies, have set up efficient deployment processes, and are motivated to work with diverse and talented teams, we’d like to meet you.
Ultimately, you will execute and automate operational processes fast, accurately, and securely.
Skills and Experience
- 4+ years of experience building infrastructure with cloud providers (AWS, Azure, GCP)
- Experience deploying containerized applications built on NodeJS/PHP/Python to Kubernetes clusters.
- Experience monitoring production workloads with relevant metrics and dashboards.
- Experience writing automation scripts using Shell, Python, Terraform, etc.
- Experience following security practices while setting up infrastructure.
- Self-motivated, able, and willing to help where help is needed
- Able to build relationships, be culturally sensitive, maintain goal alignment, and show learning agility
Roles and Responsibilities
- Manage various resources across different cloud providers (Azure, AWS, and GCP).
- Monitor and optimize infrastructure costs.
- Manage various Kubernetes clusters with appropriate monitoring and alerting setup.
- Build CI/CD pipelines to orchestrate provisioning and deployment of various services into the Kubernetes infrastructure.
- Work closely with the development team on upcoming features to determine the correct infrastructure and related tools.
- Assist the support team with escalated customer issues.
- Develop, improve, and thoroughly document operational practices and procedures.
- Set up good security practices across the various clouds.
JD:
Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model
Thorough understanding of React.js and its core principles
Experience with popular React.js workflows (such as Flux or Redux)
Familiarity with newer ECMAScript specifications
Knowledge of isomorphic React is a plus
Familiarity with RESTful APIs
Knowledge of modern authorization mechanisms, such as JSON Web Tokens (JWT)
Experience with common front-end development tools such as Babel, Webpack, NPM, etc.
Ability to understand business requirements and translate them into technical requirements
A knack for benchmarking and optimization
Familiarity with code versioning tools such as Git
