Innefu Labs: Seeking Product Head - Data Analytics

Innefu Labs is a software product company founded to address the need for innovative products that close security gaps and enable data analysis. The company focuses on developing specialized products and services in the fields of cyber security and big data analytics. It has developed Auth Shield, a multi-factor, unified, application-independent authentication solution that strengthens cyber security for critical applications and protects against phishing attacks and identity theft. Another prominent product developed by the company is Prophecy, an analytics framework with multiple modules based on big data analytics, AI, natural language comprehension, predictive modeling, and smarter analytics. The various modules are used for smart policing, security and surveillance, investigative research, cyber threat intelligence, KYC and fraud detection, criminal identification, and more. Used mainly by defence and law enforcement agencies, the framework aims to strengthen the security and surveillance systems of the country. Innefu Labs is dedicated to developing and implementing more such products to help strengthen cyber security and surveillance systems across the globe.

Innefu is among the top 5 cyber excellence companies globally (Holger Schulze awards) and has won similar recognition in national and international arenas. To Innefu's credit, more than 100 customers in India already trust the company, and the list is only growing. Exciting things are happening at Innefu and we are all game for a big leap forward. In light of this tall mandate, the company is in the process of strengthening its team. We invite applications from interested candidates with high levels of passion, competence, and commitment who are looking to build a rewarding career. You can read more about us @ www.innefu.com

Designation: Product Head - Data Analytics
Location: Delhi
No. of Positions: 1

Qualifications:
- Post-Graduation
- B.Tech/BCA/MCA

Functional Skills & Experience:
- Overall work experience of 10+ years in product management

Roles and Responsibilities

Product Management:
- Study competing products and prepare a comparison sheet
- Finalize and get approval for specifications for the next version
- Manage development of frozen features
- Adopt best practices to reduce bugs
- Provide project updates to leadership and clients

Deployments & Client Management:
- Prepare documents/SOPs for the deployment roadmap
- Visit and touch base with clients
- Achieve target average CSAT score
- Collect client testimonials
- Conduct client demos and trainings

Sales:
- Coordinate with the sales team for revenue generation
- Verify leads and validate the pipeline, including PoC(s)
- Drive sales closure
- Manage vertical P&L with EBITDA above 50%

Marketing:
- Webinars (delivered or managed)
- Partner enablement sessions
- Reports, blogs, whitepapers, brochures, flyers, presentations, proposals, and video scripts

Other Responsibilities:
- Attend seminars
- Mail campaigns
- Team-building exercises
- Sales training for the team and partners

Technical Knowledge: Big Data, Machine Learning, Deep Learning, Artificial Intelligence, Data Warehousing, Data Analytics, Natural Language Processing

Compensation: 18 LPA - 25 LPA

Key Attributes:
- Excellent qualitative and quantitative analytical skills
- Strong interpersonal and multi-tasking skills
- Ability to work independently as well as in a team
- Interest and ability to take up new responsibilities and challenges
- Adaptability to a start-up culture (unstructured, non-hierarchical, no strict silos on work profile, on-the-job learning)

Functional Knowledge:
- Natural Language Processing-based text analysis
- Thorough understanding of unstructured and structured data
- Conceptual knowledge of facial recognition and object detection from images

Delhi Office:
783, Agarwal Cyber Plaza 2
Netaji Subhash Palace, Pitampura
Delhi - 110034
Job role: As a data analyst, you will be responsible for compiling actionable insights from data and assisting program, sales, and marketing managers in building data-driven processes. Your role will involve driving initiatives to optimize for operational excellence and revenue.

Job Location: Indore | Full-Time Internship | Stipend - Performance Based

About the company: Anaxee Digital Runners is building India's largest last-mile verification & data collection network of Digital Runners (shared feet-on-street, tech-enabled) to help businesses and consumers reach the remotest parts of India, on-demand. KYC | Field Verification | Data Collection | eSign | Tier-2, 3 & 4

Sounds like a moonshot? It is. We want to make REACH across India (remotest places) as easy as ordering pizza, on-demand. Already serving 11000 pin codes (57% of India) | Website: www.anaxee.com

Important: Check out our company pitch (6 min video) to understand this goal - https://www.youtube.com/watch?v=7QnyJsKedz8

Responsibilities:
- Ensure that data flows smoothly from source to destination so that it can be processed
- Utilize strong database skills to work with large, complex data sets to extract insights
- Filter and cleanse unstructured (or ambiguous) data into usable data sets that can be analyzed to extract insights and improve business processes
- Identify new internal and external data sources to support analytics initiatives, and work with appropriate partners to absorb the data into new or existing data infrastructure
- Build tools to automate repetitive tasks so that bandwidth can be freed for analytics
- Collaborate with program managers and business analysts to help them come up with actionable, high-impact insights across product lines and functions
- Work closely with top management to prioritize information and analytic needs

Requirements:
- Bachelors or Masters (pursuing or graduated) in a quantitative field (such as Engineering, Statistics, Math, Economics, or Computer Science with Modeling/Data Science), preferably with work experience of over [X] years
- Ability to program in a high-level language is required; familiarity with R and statistical packages is preferred
- Proven problem-solving and debugging skills
- Familiarity with database technologies and tools (SQL/R/SAS/JMP etc.), data warehousing, transformation, and processing; work experience with real data for customer insights, business, and market analysis will be advantageous
- Experience with text analytics, data mining, and social media analytics
- Statistical knowledge of standard techniques: Logistic Regression, Classification models, Cluster Analysis, Neural Networks, Random Forests, Ensembles, etc.
- Play an active role in supporting operations to ensure growth of the company
- Maintain MIS for campaigns
- Handle data on funds transferred
- Generate reports as required
- Provide end-to-end solutions to customers from social media leads
- Coordinate with internal departments for finance & tech support
- Facilitate smooth functioning of fundraisers & campaigns
- Work effectively & actively in a team
We are looking for a technical architect to lead our data and analytics team. You will work with the engineering, design, and product teams to help us track application usage, understand the data, and help present it to our customers in a way that makes business sense to them.

What you did in the past that makes you a great fit for our team:
- Experience in data, analytics, and visualization
- Proven capability in architecting solutions independently and designing applications with hands-on participation
- Affinity for profiling and analyzing code to identify areas for improvement
- Strong technical foundation, including knowledge of different programming paradigms: OOP/Functional Programming/Procedural Programming
- Experience with containers and container scheduling and management platforms such as Docker, rkt, Mesos, or Kubernetes
- Experience with SQL & NoSQL databases
- Experience with cloud-based infrastructure-as-a-service platforms: AWS/Google Compute Engine/Azure/SoftLayer/OpenStack etc.
- Strong functional/systems design experience with enterprise-level systems, and the ability to balance the long-term and short-term implications of individual design decisions
- Experience with infrastructure automation, infrastructure as code, automated application deployment, monitoring/telemetry, logging, reporting/dashboarding, and continuous delivery technologies
- Release and environment management, metadata and data migration, environment comparisons, and version control
- Excited by working in a fast-paced startup environment
- Able to occasionally travel to the USA headquarters in Redwood Shores, CA
- Provides mentoring and guidance to other team members, including new hires

What is it like to work at Simpplr / why will you join us? We're a very people-focused startup and we strive for an open and transparent culture. We value those able to push themselves and empower their fellow teammates, while keeping a relaxed working environment.
Simpplr provides a competitive compensation package along with medical and accidental insurance. We believe in work-life integration and offer a flexible work environment. These benefits, coupled with an amazing team of individuals who believe in our mission and value openness, collaboration, and teamwork, make Simpplr an incredible place to work. It's really important to us to celebrate our successes, so we have lots of team meals and outings too. In short, we'll make sure you love working at Simpplr. Check out our Glassdoor rating here: https://www.glassdoor.co.in/Reviews/Simpplr-Reviews-E1628815.htm

Other Benefits:
- Salary: Best in the industry
- Reward & recognition
- Medical and accidental insurance
- Work from home
- Flexible working hours
- Casual dress
- Parties & fun
- Flexible salary benefits
- Internet and mobile reimbursement
Looking for a high performer who is willing to go where not many have ventured in the technical world, wants to challenge the status quo, and wants to make a difference in HealthTech. Your aspirations and your capabilities are the only boundaries as you grow and become an integral part of designing and developing a futuristic solution.

Responsibilities:
- Design solutions related to data streaming and data analysis, with or without time series databases
- Build secure, scalable, and robust systems to fulfill the vision and mission of the organization
- Take ownership of designing solutions covering IoT (connectivity and communication), gateway, data fabric, data management, analytics, and user-friendly applications (mobile as well as web-based)
- Deliver PoCs and end-to-end, full enterprise-grade IoT solutions
- Work effectively across technologies with team members

Qualifications

The ideal candidate should possess the following.

Requirements:
- Prior experience in building enterprise solutions with quick time to market
- Experience in design, implementation, and deployment of IoT/M2M systems and enterprise IT solutions
- Core Java/Scala expertise is a must
- Experience or expertise with data handling, data streaming, and data processing is much needed
- Experience with HealthTech is a plus, but not required
- Strong experience in software and web services development, including Java, Angular, and REST
- Knowledge of data storage, cloud computing, and big-data analytics
- Excellent verbal, written, and presentation communication skills, and experience presenting to senior management
- Experience with data streaming and data analysis, be it real time or near real time
- Experience with any of the messaging, real-time processing, monitoring, and storage technologies such as Netty, Kafka, NiFi, Storm, Spark, Kibana, MongoDB, or Cassandra would be ideal
- Experience with time series databases would be an added advantage
- Cloud experience: experience with cloud architecture and solution development in any of the public clouds: Microsoft Azure, Google Cloud, or AWS

General Experience:
- Bachelor's degree in Computer Science or Engineering is ideal, but not a must
- Minimum 4 years of industry experience, with 2+ years delivering enterprise solutions using the above technologies
- Innovative and willing to learn in a fast-paced environment
- Action-oriented and able to work independently with clear priorities
- We expect our developers to be really strong technically, to have an educated opinion, and to ask questions about everything they do before they do it
- Our interview process is really hands-on (data structures, algorithms, threading, collections)