InfluxDB Jobs in Bangalore (Bengaluru)
- Architect and implement modules for ingesting, storing and manipulating large data sets for a variety of cybersecurity use-cases.
- Write code to provide backend support for data-driven UI widgets, web dashboards, workflows, search and API connectors.
- Design and implement high performance APIs between our frontend and backend components, and between different backend components.
- Build production quality solutions that balance complexity and performance
- Participate in the engineering life-cycle at Balbix, including designing high quality UI components, writing production code, conducting code reviews and working alongside our backend infrastructure and reliability teams
- Stay current on the ever-evolving technology landscape of web based UIs and recommend new systems for incorporation in our technology stack.
- Product-focused and passionate about building truly usable systems
- Collaborative and comfortable working across teams including data engineering, front end, product management, and DevOps
- Responsible and like to take ownership of challenging problems
- A good communicator who facilitates teamwork through good documentation practices
- Comfortable with ambiguity and able to iterate quickly in response to an evolving understanding of customer needs
- Curious about the world and your profession, and a constant learner
- BS in Computer Science or related field
- At least 3 years of experience in the backend web stack (Node.js, MongoDB, Redis, Elasticsearch, Postgres, Java, Python, Docker, Kubernetes, etc.)
- SQL and NoSQL database experience
- Experience building APIs (development experience using GraphQL is a plus)
- Familiarity with issues of web performance, availability, scalability, reliability, and maintainability
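The API design work described above can be sketched as a minimal paginated endpoint handler. This is a hypothetical, framework-free illustration (the alert store, field names and page-size default are made up), not the actual Balbix stack:

```python
import json

# Hypothetical in-memory store standing in for a real database.
ALERTS = [{"id": i, "severity": "high" if i % 3 == 0 else "low"} for i in range(10)]

def get_alerts(severity=None, page=1, page_size=5):
    """Return one page of alerts as a JSON string, filtered by severity.

    A real backend would query Postgres/Elasticsearch; this sketch only
    shows the pagination and filtering contract such an API exposes.
    """
    rows = [a for a in ALERTS if severity is None or a["severity"] == severity]
    start = (page - 1) * page_size
    body = {
        "page": page,
        "total": len(rows),
        "results": rows[start:start + page_size],
    }
    return json.dumps(body)
```

Returning the total alongside the page lets UI widgets render pagination controls without a second round trip.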
Designation – Deputy Manager - TS
- Total of 8-9 years of development experience in Data Engineering (B1/BII role)
- Minimum of 4-5 years in AWS data integrations, with very good data modelling skills.
- Should be very proficient in end-to-end AWS data solution design, covering not only strong data ingestion and integration skills (both data at rest and data in motion) but also complete DevOps knowledge.
- Should have experience in delivering at least 4 Data Warehouse or Data Lake Solutions on AWS.
- Should have very strong experience with Glue, Lambda, Data Pipeline, Step Functions, RDS, CloudFormation, etc.
- Strong Python skills.
- Should be an expert in cloud design principles, performance tuning and cost modelling; AWS certifications are an added advantage.
- Should be a team player with excellent communication skills, able to manage work independently with minimal or no supervision.
- Life Science & Healthcare domain background will be a plus
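The Lambda-based ingestion pattern this role describes can be sketched as an S3-event-triggered handler. Only the event parsing is shown so the sketch stays self-contained; bucket and key names are hypothetical, and a real handler would go on to read the object (e.g. with boto3) and push it into Glue or RDS:

```python
import json
import urllib.parse

def lambda_handler(event, context):
    """Minimal S3-triggered ingestion handler (illustrative sketch).

    Extracts (bucket, key) pairs from the standard S3 event shape.
    S3 URL-encodes object keys in events, hence the unquote_plus.
    """
    objects = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        bucket = s3["bucket"]["name"]
        key = urllib.parse.unquote_plus(s3["object"]["key"])
        objects.append((bucket, key))
    return {"statusCode": 200, "body": json.dumps({"ingested": len(objects)})}
```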
The company is the world's No. 1 global management consulting firm.
Graduate or postgraduate degree in statistics, economics, econometrics, computer science,
engineering, or mathematics
2-5 years of relevant experience
Adept in forecasting, regression analysis and segmentation work
Understanding of modeling techniques, specifically logistic regression, linear regression, cluster
analysis, CHAID, etc.
Statistical programming software experience in R & Python, comfortable working with large data
sets; SAS & SQL are also preferred
Excellent analytical and problem-solving skills, including the ability to disaggregate issues, identify
root causes and recommend solutions
Excellent time management skills
Good written and verbal communication skills; understanding of both written and spoken English
Strong interpersonal skills
Ability to act autonomously, bringing structure and organization to work
Creative and action-oriented mindset
Ability to interact in a fluid, demanding and unstructured environment where priorities evolve
constantly and methodologies are regularly challenged
Ability to work under pressure and deliver on tight deadlines
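The regression work listed above is normally done in R or Python libraries (statsmodels, scikit-learn); as a minimal sketch, simple linear regression can be fit in closed form, which makes the slope and intercept formulas explicit:

```python
def fit_simple_ols(xs, ys):
    """Fit y = a + b*x by ordinary least squares (closed form).

    b = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2)
    a = mean_y - b * mean_x
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    b = sxy / sxx
    a = mean_y - b * mean_x
    return a, b
```

For example, fitting the points (1, 3), (2, 5), (3, 7), (4, 9) recovers a = 1 and b = 2.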
Work closely with the Kinara management team to investigate strategically important business questions
Lead a team through the entire analytical and machine learning model life cycle:
Define the problem statement
Build and clean datasets
Perform exploratory data analysis
Apply ML algorithms and assess their performance
Write code for deployment
Test and troubleshoot code
Communicate analysis to stakeholders
Manage Data Analysts and Data Scientists
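The lifecycle steps above can be sketched end to end with a deliberately trivial model (a majority-class baseline); the synthetic dataset, split ratio and seed are made up for illustration:

```python
import random

def build_dataset(n=200, seed=7):
    """Build/clean step: synthesize labeled (feature, label) rows."""
    rng = random.Random(seed)
    return [(rng.random(), 1 if rng.random() < 0.7 else 0) for _ in range(n)]

def train_majority(rows):
    """'Fit' a majority-class baseline classifier (stand-in for a real model)."""
    ones = sum(label for _, label in rows)
    return 1 if ones >= len(rows) / 2 else 0

def accuracy(model, rows):
    """Assess performance against held-out rows."""
    return sum(1 for _, label in rows if label == model) / len(rows)

data = build_dataset()
split = int(0.8 * len(data))          # train/test split
train, test = data[:split], data[split:]
model = train_majority(train)
holdout_acc = accuracy(model, test)   # the number you'd report to stakeholders
```

Deployment, testing and stakeholder communication then wrap around this core loop; the baseline also gives later, real models a floor to beat.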
Vahak (https://www.vahak.in) is India’s largest and most trusted online transport marketplace and directory for road transport businesses and individual commercial vehicle (trucks, trailers, containers, Hyva, LCVs) owners, offering online truck and load booking, transport business branding and transport business network expansion. Lorry owners can find intercity and intracity loads from all over India and connect with other businesses to find trusted transporters and the best deals in the Indian logistics services market. With the Vahak app, users can book loads and lorries from a live transport marketplace of over 5 lakh transporters and lorry owners across 10,000+ locations for daily transport requirements.
Vahak has raised a capital of $5 Million in a “Pre Series A” round from RTP Global along with participation from Luxor Capital and Leo Capital. The other marquee angel investors include Kunal Shah, Founder and CEO, CRED; Jitendra Gupta, Founder and CEO, Jupiter; Vidit Aatrey and Sanjeev Barnwal, Co-founders, Meesho; Mohd Farid, Co-founder, Sharechat; Amrish Rau, CEO, Pine Labs; Harsimarbir Singh, Co-founder, Pristyn Care; Rohit and Kunal Bahl, Co-founders, Snapdeal; and Ravish Naresh, Co-founder and CEO, Khatabook.
As a Senior Product Analyst, you will play a key role in defining & growing our mobile & web applications. You will be collaborating with Product Managers, Designers, Growth Marketers, and Engineers & Customer Success Managers. You will provide data-backed insights, dashboards & visualization capabilities to help teams assess product & business successes/failures & make data driven decisions while planning and prioritizing.
Our ideal candidate will have strong data analysis and data visualization skills, will have led or managed analytics teams, and will have an understanding of data setups/infrastructure. We love people who are humble and collaborative, with a hunger for excellence.
Improve Dashboards & Data Visualization Capabilities:
- Create & manage dashboards on various mobile & web tools like Clevertap, Branch, Google Analytics, Zoho Page Sense, Firebase and other data tools that marketing and product teams use on a daily basis.
- Create richer & holistic (collating multiple data sources/platforms) dashboards and reports on Tableau & Metabase for Product, Marketing & Business teams.
- Improve dashboards and set up alert systems on Sentry, Firebase and GA so that product and technology teams catch fatal crashes and errors in a timely manner.
Deliver Data Backed Insights & Periodic Analysis Reports:
- Perform deep data-analysis to identify issues, trends, opportunities and improvements possible in Acquisition, Activation, Transactions, Conversion Ratios & other product-driving metrics. Present findings to product, marketing and business teams.
- Perform analysis to assess the success of ongoing product initiatives, provide insights to product managers & designers to iterate product features and flows.
- Create & track product and marketing funnels to identify growth opportunities and problem areas. Collaborate with PMs and designers to discover underlying reasons for drop points. Analyze Funnel, Cohort, Trends, LTV, DAU, MAU, Retention & user behaviour using data & help teams make data-driven decisions.
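The funnel and drop-point analysis described above boils down to counting, at each step, the users who completed every preceding step. A minimal sketch (event names and users are hypothetical; in practice the counts come from Clevertap or GA exports):

```python
def funnel_conversion(events, steps):
    """Compute step-by-step funnel counts and conversion ratios.

    `events` maps user_id -> set of event names; a user counts toward a
    step only if they also completed every preceding step.
    """
    remaining = set(events)
    counts = []
    for step in steps:
        remaining = {u for u in remaining if step in events[u]}
        counts.append(len(remaining))
    # conversion of each step relative to the previous one
    prev = [len(events)] + counts[:-1]
    ratios = [c / p if p else 0.0 for c, p in zip(counts, prev)]
    return counts, ratios
```

A sharp dip in one ratio flags the drop point to investigate with PMs and designers.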
Define Measurement & Success Criteria for Experiments, New Features and A/B Tests:
- Collaborate with Product Manager to design, execute and analyze A/B tests, define target KPIs & control segments.
- Help Product Managers define success criteria and measurement standards for new features and product offerings.
- Ensure the technical setup is in place to monitor important KPIs and collect custom event data. Define custom events and properties to be tracked on Clevertap, GA and Zoho Page Sense.
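Analyzing an A/B test of the kind described above typically reduces to a two-proportion z-test on conversion counts (which Clevertap/GA collect). A self-contained sketch, with illustrative numbers:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment.

    Takes converted counts and sample sizes for variants A and B;
    returns (z statistic, two-sided p-value) under a pooled-proportion
    normal approximation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

For example, 100/1000 conversions in control vs. 150/1000 in the variant yields a clearly significant lift; in production one would also pre-register the target KPI and sample size.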
Drive increased engagement & retention using Clevertap:
- Define AARRR KPIs and Optimize them with CT multi-channel journeys by continuously performing A/B tests.
- Create detailed and diverse user-segments to identify high, medium & low activity cohorts. Create strategies to increase the size of high-activity cohorts.
- Analyze website & mobile traffic, create user journeys & campaigns to enable cross-channel product experience and marketing.
Manage, Mentor & lead Product Analytics:
- Act as the analytics subject-matter expert in the team. Train, hire and manage a team of analysts. Set up collaborative processes and management practices to help teams achieve important analytics goals.
- Assist product stakeholders in making data-driven decisions to achieve short- and long-term goals relating to product growth.
- Collaborate & hire analysts to improve the overall product analysis capabilities.
- Research and stay updated on upcoming technologies and trends in digital product analytics.
- Bachelor’s Degree in Computer Science, Engineering or any other related field.
- 4+ years of experience in mobile/web application analytics, business analytics, must have led or managed a team before.
- In-depth knowledge & hands on experience of digital analytics software like Zoho Page Sense, Google Analytics, CleverTap, Branch, Firebase, Facebook Analytics.
- Expertise in dash-boarding tools like Tableau, Metabase and Excel.
- Experience with data warehousing concepts and ETL development and tools
- Fetching and manipulating large data sets using SQL / Hive.
- Analysing and visualizing large data sets: R / python.
- Ability to prioritize and manage multiple milestones and projects efficiently.
- Team spirit, good time-management skills, strong communication skills to collaborate with various stakeholders.
- Detail oriented, advanced problem solving skills with customer-centric approach.
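The "fetching and manipulating large data sets using SQL / Hive" requirement above can be sketched with SQLite standing in for the warehouse; the table, columns and values are made up, but the grouped-aggregation query is the plain SQL that Hive and Metabase dashboards are built on:

```python
import sqlite3

# In-memory SQLite stands in for Hive / the warehouse here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", "booking", 120.0), ("u1", "booking", 80.0), ("u2", "booking", 50.0)],
)
rows = conn.execute(
    """
    SELECT user_id, COUNT(*) AS bookings, SUM(amount) AS total
    FROM events
    WHERE event = 'booking'
    GROUP BY user_id
    ORDER BY total DESC
    """
).fetchall()
```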
- Preliminary Interview With the Product Head
- Technical Interview
- Operational Interview With The Product Head
- Final Interview With The CEO
Must Have Skills:
• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good experience with SQL databases; able to write queries of fair complexity
• Excellent experience in Big Data programming for data transformations and aggregations
• Good at ELT architecture: business-rule processing and data extraction from the data lake into data streams for business consumption
• Good customer communication skills
• Good analytical skills
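The DataFrame aggregations this role centers on can be illustrated without a Spark cluster; the plain-Python stand-in below mirrors the logic of a typical groupBy/sum (the equivalent PySpark call is noted in the docstring; column names are hypothetical):

```python
from collections import defaultdict

def group_sum(rows, key, value):
    """Plain-Python stand-in for a Spark-style aggregation.

    Roughly equivalent PySpark: df.groupBy(key).agg(F.sum(value))
    `rows` is a list of dicts; returns {key value -> summed value}.
    """
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)
```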
- Ensure and own Data integrity across distributed systems.
- Extract, Transform and Load data from multiple systems for reporting into BI platform.
- Create Data Sets and Data models to build intelligence upon.
- Develop and own various integration tools and data points.
- Hands-on development and/or design within the project in order to maintain timelines.
- Work closely with the Project manager to deliver on business requirements OTIF (on time in full)
- Understand the cross-functional business data points thoroughly and be SPOC for all data-related queries.
- Work with both Web Analytics and Backend Data analytics.
- Support the rest of the BI team in generating reports and analysis
- Quickly learn and use bespoke and third-party SaaS reporting tools with little documentation.
- Assist in presenting demos and preparing materials for Leadership.
- Strong experience in data warehouse modeling techniques and SQL queries
- A good understanding of designing, developing, deploying, and maintaining Power BI report solutions
- Ability to create KPIs, visualizations, reports, and dashboards based on business requirement
- Knowledge and experience in prototyping, designing, and requirement analysis
- Be able to implement row-level security on data and understand application security layer models in Power BI
- Proficiency in writing DAX queries in Power BI Desktop.
- Expertise in using advanced level calculations on data sets
- Experience in the Fintech domain and stakeholder management.
The Data Engineer will be responsible for selecting and integrating the required Big Data tools and frameworks, and will implement data ingestion and ETL/ELT processes.
Required Experience, Skills and Qualifications:
- Hands on experience on Big Data tools/technologies like Spark, Databricks, Map Reduce, Hive, HDFS.
- Expertise and excellent understanding of big data toolset such as Sqoop, Spark-streaming, Kafka, NiFi
- Proficiency in at least one of the following programming languages: Python, Scala or Java, with 4+ years’ experience
- Experience in Cloud infrastructures like MS Azure, Data lake etc
- Good working knowledge of NoSQL DBs (Mongo, HBase, Cassandra)