About Nextalytics Software Services Pvt Ltd
Job Description: Data Analyst
Position: Data Analyst
Location: Gurgaon
Experience Level: 1-3 Years
Employment Type: Full-Time
Role Overview
We are looking for a results-driven Data Analyst to join our team and support business decision-making through data insights and analytics. The ideal candidate will be highly proficient in Python, skilled in data visualization tools, and experienced in solving complex problems to drive measurable business outcomes such as revenue growth or cost reduction.
Key Responsibilities
Data Analysis and Insights:
Extract, clean, and analyze large datasets using Python to uncover trends and actionable insights (a brief illustrative sketch follows the responsibilities list below).
Develop predictive models and conduct exploratory data analysis to support business growth and operational efficiency.
Business Impact:
Identify opportunities to increase revenue or reduce costs through data-driven strategies.
Collaborate with stakeholders to understand business challenges and provide analytics-driven solutions.
Data Visualization:
Build intuitive dashboards and reports using tools like Zoho Analytics, Looker Studio, or Tableau.
Present findings and insights clearly to both technical and non-technical stakeholders.
Problem-Solving:
Work on end-to-end problem-solving, from identifying issues to implementing data-backed solutions.
Continuously optimize processes through automation and advanced analytics techniques.
Collaboration and Reporting:
Work closely with teams across departments to understand data needs and deliver solutions tailored to their goals.
Provide ongoing reporting and insights to track key performance indicators (KPIs) and project outcomes.
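To make the data analysis responsibility above concrete (as referenced in the list), here is a minimal, hedged sketch of the kind of cleaning and trend analysis this role describes, assuming pandas and matplotlib are available; the file name and columns (orders.csv, order_date, region, revenue) are hypothetical placeholders, not an actual dataset.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical dataset: orders.csv with order_date, region, revenue columns.
df = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Basic cleaning: drop duplicates and rows with missing revenue.
df = df.drop_duplicates().dropna(subset=["revenue"])

# Monthly revenue trend per region - the kind of insight used for revenue/cost decisions.
df["month"] = df["order_date"].dt.to_period("M").dt.to_timestamp()
monthly = df.groupby(["region", "month"], as_index=False)["revenue"].sum()

# Simple visualization for stakeholders.
for region, grp in monthly.groupby("region"):
    plt.plot(grp["month"], grp["revenue"], label=region)
plt.legend()
plt.title("Monthly revenue by region")
plt.show()
```

In practice, a BI tool such as Zoho Analytics, Looker Studio, or Tableau would typically take over from here for the stakeholder-facing dashboards mentioned above.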
Required Skills & Qualifications
Technical Expertise:
Strong proficiency in Python, including libraries such as Pandas, NumPy, Matplotlib, and Seaborn.
Hands-on experience with BI tools like Zoho Analytics, Looker Studio, or Tableau.
Analytical Skills:
Proven ability to analyze data to generate insights that drive decision-making.
Demonstrated success in addressing business challenges and achieving results such as revenue growth or cost reduction.
Problem-Solving:
Experience working on real-world business problems, identifying root causes, and implementing data-based solutions.
Communication:
Strong ability to communicate complex insights effectively to diverse audiences.
Excellent presentation and storytelling skills to translate data into actionable business strategies.
Preferred Qualifications
Bachelor’s or Master’s degree in Data Science, Statistics, Computer Science, or related field.
Certifications in data analytics tools or platforms (e.g., Tableau, Looker).
Experience with advanced analytics or machine learning concepts.
What We Offer
Opportunity to work on impactful projects that directly influence business outcomes.
Collaborative, innovative, and supportive work environment.
Access to cutting-edge tools and technologies.
Competitive salary and growth opportunities.
We’re looking for a Technical Lead to join our early engineering team and lead our product development. If you’re someone who thrives on high ownership, can figure stuff out on your own and wants to be part of the zero-to-one journey, this might be for you.
What you’ll do
- Designing for scale: We have complex systems working overtime with a lot of moving pieces. It’s just going to get worse. We need to reimagine scalable & reusable LEGOs 🧩;
- Setup DevOps: Rome was not built without discipline, and we’re not aiming for anything less. We’ve got a lot of ground to cover and we need your help 🛠;
- Building the dream team: We need a Phil Jackson to get us a Rodman for our defence. A Dhoni to make our Rohit open. It’s going to be tough finding the right people and tougher putting them to the right work 🤾♂️;
- Prioritise and delegate tasks: What tech debt to fix while also building for current business requirements? How to get it all done in time using limited resources? 🧐;
- And after all of it, of course, be ready to jam to this every time we push to prod - https://www.youtube.com/watch?v=h9QNUcrjtOs 🔥;
Our Way Of Life
Over time we’ve realised that while we’re super excited about shaping the future of commerce, a big part of why people join us and stick with us is because they resonate with our way of life. You could call it work culture - but it ends up becoming more than just that.
It’s taken us time to discover and articulate what our culture feels like, this evolving document is an attempt to candidly share what it’s like working at BiteSpeed.
Our Purpose
At BiteSpeed, work is personal. You could blame this on us being existential, but most of us are spending the best years of our lives doing this and we want to be purposeful about the kind of workplace we’re trying to create. Our purpose is about why we’re here and what we care about:-
Personal Transformation
- We like to think of BiteSpeed as being a gym for our careers. It’s where we come to do great work we can be proud of and push ourselves in the pursuit of excellence. Is it comfortable? No. Is it painful? Sometimes. Is it fulfilling? Yes. We were never the company that was supposed to win - We started out of a dorm room, solo founder, early engineers who hadn’t written code in their lives setting out to build a global SaaS company. Our roots are in doing things we are unqualified for and we bet on people who want that journey for themselves. There are stories of people across the company from a 20 year old who’s never done sales closing enterprise deals to a college intern owning an enterprise product lifecycle - these are the stories we are proud of. If someone can look back after 2 years of working at BiteSpeed and say they don’t identify with who they were, we’d call it a success and we want to help them get there.
Wealth Creation
- Somehow most companies are shy about wanting to make money. It’s looked at as this thing which everyone does but no one really talks about. We’re not ashamed of doing it for the money. Wealth unlocks choice. If life is an amusement park, we think there is value to getting an unlimited rides pass. It's purposeful for us to try to create wealth that allows people to achieve their life's dreams - whether that's owning a house or booking a dream vacation for their parents. We do this by ensuring everyone in the team gets equity and there are generous cash & equity bumps on a frequent basis to reward performance and alignment in values.
Winning Together
- We’re not here for a participation certificate. We’re playing to win. The keyword here is ‘together’. Winning ‘together’ is about recognising it’s a team sport. We don't care about man of the match awards, either we win the trophy or we don’t. There is a certain camaraderie that comes with winning together that’s hard to explain, but it’s deeply fulfilling and energising. The question we ask ourselves is - can we play the game like it’s never been played before?
Our Values
Our values are about how we do what we do. Values define the right thing to do. We hire, reward and sometimes have to let go based on our values. We have 5 core values:-
- Go Above And Beyond - We value people who care about doing a good job. Going above and beyond is about doing more than the bare minimum that gets the job done and raising the bar each time we have the opportunity to do so
- Making Things Happen - Each company has an operating rhythm and this is that for us. Making things happen is the opposite of being passive. It’s about high agency, about always believing there is a way to get what we want and either finding the way or making the way
- Say It Like It Is - We are candid and direct when it comes to sharing feedback, transparent with our numbers and intellectually honest about the realities of any business situation
- Progress Over Perfection - We’re not building rockets. We care about moving fast and iterating towards perfection. We like to take a minimum viable approach to prioritisation and problem solving and actively look for 80/20 solutions
- Don’t Take Yourself Seriously, Take Your Work Seriously - Great things are built when people can contribute to pursuits beyond themselves. Being low ego, not needing praise to do a good job, taking feedback with humility, being self-critical all add up to this
Perks & Benefits
Small things we’ve done to ensure we take care of our wellness, learning & keep things fun:-
- Health Insurance - Health insurance cover and accident coverage for extra cushion and mental peace when rainy days hit us
- Work From Home Budget - Your gadget quirks are taken care of with our WFH budget, whether it's a standing desk to burn those extra calories or showing off your cool desk setup, we've got you covered
- Quarterly Off-sites - Quarterly off-sites are a core part of the BiteSpeed culture. Our off-sites range from intense quarter planning sessions to crazy mafia nights and competitive cricket matches (with a lot of trash talking)
- Cult Fitness Membership - All work and no play makes Jack a dull boy. Cult Fit and Cult Play passes to make sure we hit the gym more often
- Personal Development - We sponsor courses, conference tickets, books on a case to case basis to ensure we’re constantly growing
- Unlimited Leaves - We trust people to make good decisions on when they need a break and for how long (leave management systems are expensive, but that’s a separate discussion)
- Salary In Advance - Trust first, by default. We pay out salaries in the first week of the month
Experience managing agile development teams. You will be responsible for:
- Defining the technical specifications and architecture along with the overall technical product infrastructure.
- Providing creative and unique solutions to accommodate the versatility and customizations required during product development.
- The role would expect you to code and/or conduct code reviews, so experience in Django or similar frameworks would be useful.
- Managing a team of engineers and developers, ensuring quality is met at every stage of the development and deployment phases.
- Recruiting and building a team with a good mix of talent.
- Managing AWS servers to track, monitor, and optimize resource utilization, and upgrading servers when necessary (see the monitoring sketch after this list).
- Extensive engineering expertise in the areas of tooling, specifications, product maintenance, budgeting and software products.
- Ability to understand and collaborate on the complete product lifecycle from concept to execution.
- Implementation of new technologies to optimize product quality and usage.
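Purely as an illustration of the AWS monitoring responsibility referenced in the list above, the sketch below pulls average EC2 CPU utilization from CloudWatch with boto3; the instance ID, region, and the 80% threshold are hypothetical placeholders.

```python
from datetime import datetime, timedelta, timezone
import boto3

# Hypothetical values; replace with a real instance ID and region.
INSTANCE_ID = "i-0123456789abcdef0"
cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")

now = datetime.now(timezone.utc)
resp = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
    StartTime=now - timedelta(hours=1),
    EndTime=now,
    Period=300,                 # 5-minute datapoints
    Statistics=["Average"],
)

# Flag sustained high utilization as a candidate for an instance upgrade.
for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
    if point["Average"] > 80:   # arbitrary threshold for illustration
        print(f"{point['Timestamp']}: CPU at {point['Average']:.1f}% - consider upgrading")
```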
We are looking for a talented C++ Developer with experience in the design, development, and debugging of multi-threaded, large-scale applications, a good understanding of data structures, familiarity with Linux packaging, functional testing, and deployment automation, and very good problem-solving skills.
Key responsibilities :
- Understand fundamental design principles and best practices for developing backend servers and web applications
- Gather requirements, scope functionality, estimate and translate those requirements into solutions
- Implement and integrate software features as per requirements
- Deliver across the entire app life cycle
- Work in a product creation project and/or technology project with implementation or integration responsibilities
- Improve an existing code base, if required, and ability to read source code to understand data flow and origin
- Design effective data storage for the task at hand and know how to optimize query performance along the way
- Follow an agile methodology of development and delivery
- Strictly adhere to coding standards and internal practices; must be able to conduct code reviews
- Mentor and possibly lead junior developers
- Contribute towards innovation
- Performance optimization of apps
- Explain technologies and solutions to technical and non-technical stakeholders
- Diagnose bugs and other issues in products
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency
Must have / Good to have:
- 5-7 years' experience with C++ development; 3+ years of relevant experience with modern C++ (11/14/17) would be a plus.
- Design and implementation of high-availability, high-performance applications in a Linux environment
- Advanced knowledge of C/C++, Object Oriented Design, STL
- Good with multithreading and data structures
- Develop back-end components to improve responsiveness and overall performance
- Familiarity with database design, integration with applications, and Python packaging.
- Familiarity with front-end technologies (like JavaScript and HTML5), REST API, security considerations
- Familiarity with functional testing and deployment automation frameworks
- Experience developing 3-4 production-ready applications using C++ as the programming language
- Experience in writing unit test cases, including positive and negative test cases (see the illustrative sketch after this list)
- Experience with CI/CD pipeline code deployment (Git, SVN, Jenkins, or TeamCity)
- Experience with Agile and DevOps methodology
- Very good problem-solving skills
- Experience with Web technologies is a plus.
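The role above is C++-focused; purely to illustrate the positive/negative unit test case requirement referenced in the list (and to keep the examples in this document in a single language), here is a minimal pytest sketch in Python. The parse_port helper is invented for illustration only.

```python
import pytest

def parse_port(value: str) -> int:
    """Hypothetical helper: parse a TCP port number from a string."""
    port = int(value)            # raises ValueError for non-numeric input
    if not (0 < port < 65536):
        raise ValueError(f"port out of range: {port}")
    return port

# Positive test cases: valid inputs produce the expected results.
@pytest.mark.parametrize("raw, expected", [("80", 80), ("65535", 65535)])
def test_parse_port_valid(raw, expected):
    assert parse_port(raw) == expected

# Negative test cases: invalid inputs fail loudly rather than silently.
@pytest.mark.parametrize("raw", ["0", "70000", "abc", ""])
def test_parse_port_invalid(raw):
    with pytest.raises(ValueError):
        parse_port(raw)
```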
Senior Software Engineer (FullStack)
We know that, as an experienced engineer, you have built software to solve various business problems at your previous workplaces. You may have also explored technologies on your own for your learning or hobby projects.
You will be building APIs for the Synup platform and also UI to make our platform capabilities available to our customers.
You and the team that you are a part of will be collectively responsible for building performant software and customer experiences that scale to our next million customers. You will be responsible for writing technical specs and contributing to their implementation. We expect that you would have done the same in your previous workplaces.
Other folks on our team are looking forward to learning from your experiences.
For engineers that join our team
We expect you to be good with Ruby or Python to build APIs.
You will be contributing to our UI that is built with React and GraphQL.
We hope our team members have a strong grasp of software design patterns and know when to put them to good use.
Experience with an SQL datastore would help a lot. PostgreSQL is our primary datastore. We optimize our search functionality and rollup reports by using ElasticSearch.
We expect that you have used Redis. Redis is our Swiss Army knife for solving a lot of problems apart from just caching.
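As a small illustration of using Redis beyond caching, here is a hedged sketch of a fixed-window rate limiter with redis-py; the connection settings, key naming, and limits are hypothetical and not a description of Synup's actual usage.

```python
import time
import redis

# Hypothetical connection; adjust host/port/db for a real deployment.
r = redis.Redis(host="localhost", port=6379, db=0)

def allow_request(user_id: str, limit: int = 100, window_seconds: int = 60) -> bool:
    """Fixed-window rate limiter: at most `limit` requests per user per window."""
    window = int(time.time() // window_seconds)
    key = f"ratelimit:{user_id}:{window}"

    count = r.incr(key)                 # atomic increment, creates the key at 1
    if count == 1:
        r.expire(key, window_seconds)   # let the window key clean itself up
    return count <= limit

# Usage sketch:
if allow_request("user-42"):
    pass  # handle the request
```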
Principal Software Engineer /Architect
Axtechnosoft Private Limited
Job Description
Responsibilities: -
- You would take ownership of the existing system and scale it more than 10X over the next 2 years.
- Apply best coding standards.
- You would create the infrastructure that can serve 100s of customers and millions of data requests per hour.
- Over the next year or so, you would be able to guide a team of 5 to 15 people to accomplish your goals. Mentoring this team into a world-class engineering team would be a key part of your role.
- You would draw on your earlier experience in successfully building, deploying, and running complex, large-scale web or data products.
- You would work hand-in-hand with the Product Management team to build engineering capabilities that align with the evolution of the product.
- Eventually, work with the Data Science team to ensure that the algorithmic intelligence we build is plugged into the product as expected.
- Overall, you would be responsible for end-to-end architecting from an Engineering standpoint.
Must have: -
- Total experience of 8+ years, with relevant experience of at least 2 years.
- Have built a platform that handles at least 500k to 1 million data requests an hour.
- Worked on building an infrastructure that serves 200k+ customers.
- Hands on coder.
- Expert-level knowledge of at least one technology stack - Python or, ideally, Java - along with Angular, React, or Node.js.
- Expert-level knowledge of Elasticsearch or NoSQL technologies like MongoDB/HBase/Cassandra/Redis/Neo4j
- Experience developing web applications.
- Knowledge of multiple front-end languages and libraries (e.g. HTML/ CSS, JavaScript, XML, jQuery)
- Working knowledge of databases (e.g. MySQL, MongoDB), web servers (e.g. Apache) and UI/UX design.
- Devops experience working with AWS / Other cloud platforms.
- Strong knowledge of APIs.
- Excellent communication and teamwork skills
- Implementing Software Engineering best practices.
- Previously worked on user facing products with scale.
- Agile methodology.
- Great attention to detail.
- Organizational skills
- An analytical mind
Good to have: -
- Working knowledge of React Redux.
- Open-source technology.
- Working knowledge of AI/ ML.
- Degree in Computer Science, Statistics or relevant field.
- Experience working in a start-up environment.
Key Skills
Python
Angular
JavaScript
React
Node.js
Elastic Search
NoSQL
Web Applications
Database
Web Servers
UX/UI Design
AWS Cloud
Agile Methodology
Job Responsibilities
Design and build cloud-based products and services with massive scale and reliability.
Write clean and modular code, primarily in Python, to create multi-tenant microservices handling terabyte-scale data per month with end-to-end latency SLAs and tenant fairness.
Build a CI/CD-based software development model with end-to-end ownership of code delivery - starting from design/architecture and coding, through automated functional/integration testing, to operating and monitoring the service in production.
Use relevant technologies and cloud services like Kafka, Redis, Mongo, RDS, Spark Streaming, Redshift, Airflow to build highly performant and scalable distributed systems
Design and develop data schema and access layer to optimally store and retrieve data
Stay up to date with the latest developments in cloud computing and incorporate relevant learnings to both product features and product architecture.
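To give a flavour of the event-driven work listed above, here is a minimal, hypothetical sketch using the kafka-python client to publish and consume tenant-keyed events; the broker address, topic, and group id are placeholders, and error handling is omitted for brevity.

```python
import json
from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"          # placeholder broker address
TOPIC = "tenant-events"            # placeholder topic name

# Producer: publish a JSON event keyed by tenant, so partitioning preserves per-tenant ordering.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    key_serializer=lambda k: k.encode("utf-8"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, key="tenant-123", value={"event": "ingest", "bytes": 1024})
producer.flush()

# Consumer: read events and process them (a real service would manage offsets and retries carefully).
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    group_id="ingest-workers",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)
for message in consumer:
    print(message.key, message.value)
```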
Preferred Qualifications
BS/B.Tech (B.Tech preferred) in Computer Science, Computer Engineering, or Information Technology
Preferred Technical Skills:
2-6 years of software development experience with enterprise-grade software. Must have experience in building scalable, high-performance cloud services.
Expert coding skills in Python and Django.
In-depth experience in AWS is mandatory.
Expertise in building scalable event based asynchronous systems based on the microservices architecture
Experience working with docker and kubernetes
Experience with databases such as MongoDB, Redis, RDS, RDF, Graph DB, SPARQL, etc.
Experience with messaging technologies such as Kafka, Pulsar, SQS
Must have expertise in building REST APIs
Strong object-oriented design and programming experience.
Experience in cloud object stores: S3, Cloud Storage, Blobs, etc.
Desired Technical Skills:
Open source committer in related areas like cloud technologies, Kubernetes, databases, etc.
Additional Skills
Great written and verbal communication
Ability to work in a geo-distributed, cross-functional group
Be Part Of Building The Future
Dremio is the Data Lake Engine company. Our mission is to reshape the world of analytics to deliver on the promise of data with a fundamentally new architecture, purpose-built for the exploding trend towards cloud data lake storage such as AWS S3 and Microsoft ADLS. We dramatically reduce and even eliminate the need for the complex and expensive workarounds that have been in use for decades, such as data warehouses (whether on-premise or cloud-native), structural data prep, ETL, cubes, and extracts. We do this by enabling lightning-fast queries directly against data lake storage, combined with full self-service for data users and full governance and control for IT. The results for enterprises are extremely compelling: 100X faster time to insight; 10X greater efficiency; zero data copies; and game-changing simplicity. And equally compelling is the market opportunity for Dremio, as we are well on our way to disrupting a $25BN+ market.
About the Role
The Dremio India team owns the DataLake Engine along with the Cloud Infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides various opportunities to learn, deliver, and grow in your career. We are looking for innovative minds with experience in leading and building high quality distributed systems at massive scale and solving complex problems.
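Since the team works with Apache Arrow and columnar data, here is a small, hedged sketch using the pyarrow Python bindings to load a Parquet file into an Arrow table and apply a columnar filter; the file path and the latency_ms column are hypothetical placeholders, not part of Dremio's product.

```python
import pyarrow.parquet as pq
import pyarrow.compute as pc

# Hypothetical Parquet file, e.g. previously downloaded from data lake storage (S3/ADLS).
table = pq.read_table("events.parquet")    # columnar, in-memory Arrow representation

# Columnar filter: keep rows where the (hypothetical) latency_ms column exceeds 500.
slow = table.filter(pc.greater(table["latency_ms"], 500))

print(table.num_rows, "rows total,", slow.num_rows, "slow events")
print(slow.schema)
```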
Responsibilities & ownership
- Lead, build, deliver and ensure customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product.
- Work on distributed systems for data processing with efficient protocols and communication, locking and consensus, schedulers, resource management, low latency access to distributed storage, auto scaling, and self healing.
- Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
- Lead the team to solve complex and unknown problems
- Solve technical problems and customer issues with technical expertise
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Mentor other team members for high quality and design
- Collaborate with Product Management to deliver on customer requirements and innovation
- Collaborate with Support and field teams to ensure that customers are successful with Dremio
Requirements
- B.S./M.S/Equivalent in Computer Science or a related technical field or equivalent experience
- Fluency in Java/C++ with 8+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models, and their use in developing distributed and scalable systems
- 5+ years experience in developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Ability to solve ambiguous, unexplored, and cross-team problems effectively
- Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and Google Cloud)
- Understanding of distributed file systems such as S3, ADLS, or HDFS
- Excellent communication skills and affinity for collaboration and teamwork
- Ability to work individually and collaboratively with other team members
- Ability to scope and plan solutions for big problems and mentor others on the same
- Interested and motivated to be part of a fast-moving startup with a fun and accomplished team
Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for the development of data pipelines for numerous applications handling all kinds of data: structured, semi-structured & unstructured. Big data knowledge, especially in Spark & Hive, is highly preferred (a minimal PySpark sketch follows this list).
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
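As referenced in the first responsibility above, here is a minimal, hypothetical PySpark sketch of a pipeline that reads semi-structured JSON, aggregates it, and writes the result to a Hive table; the S3 path, columns, and table name are placeholders, and a configured Hive metastore is assumed.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support assumes the cluster has a configured metastore.
spark = (
    SparkSession.builder
    .appName("example-pipeline")
    .enableHiveSupport()
    .getOrCreate()
)

# Semi-structured input: JSON events at a hypothetical path.
events = spark.read.json("s3a://example-bucket/raw/events/")

# Simple cleaning/aggregation step.
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Persist to a Hive table for downstream analytics.
daily.write.mode("overwrite").saveAsTable("analytics.daily_event_counts")
```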
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 5+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies Hadoop, Spark, HIVE, Kafka
- Writing decently optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and AGILE.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
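For the Airflow item above, here is a small, hedged sketch of a daily orchestration DAG in the classic PythonOperator style; the DAG id, schedule, and task callables are hypothetical, and the imports assume Airflow 2.x (older 2.x versions use schedule_interval instead of schedule).

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling source data")      # placeholder for a real extract step

def transform():
    print("running Spark job")        # placeholder, e.g. triggering the PySpark pipeline above

with DAG(
    dag_id="example_daily_pipeline",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval="@daily"
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task    # run transform after extract completes
```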