
Big Data Jobs

Explore top Big Data job opportunities at top companies and startups. All jobs are added by verified employees who can be contacted directly below.

Technical Architect/CTO

Founded 1999
Location: Chennai, Bengaluru (Bangalore)
Experience: 10 - 20+ years
Salary: 20 - 70 lacs/annum

1. 10+ years of experience in Java application development
2. Full-stack architect, capable of architecting from the ground up, with experience in more than two technologies
3. Experience in supporting pre-sales activities and proposals

Drop your CV here or share it at arunk6252@gmail.com

Job posted by Arun Kumar

Production Support Engineer

Founded 2012
Location: Bengaluru (Bangalore)
Experience: 3 - 10 years
Salary: 6 - 20 lacs/annum

As a Senior Production Support Lead you will predominantly be involved in supporting high-visibility applications, including administrative platforms and reporting systems. You'll stretch your skills and grow your career as a leader. This is a unique role where your efforts will make an impact across different teams within the organization.

Roles & Responsibilities:
- Troubleshoot incidents as we encounter them. Figure out the root causes of problem tickets and implement long-term, permanent fixes so that we do not encounter similar issues again and again.
- Collaborate with Engineering and Sustaining teams so that we can design our systems for better resilience and maintainability. That way, we can pre-empt issues even before we encounter them.
- Automate existing processes so that we gain order-of-magnitude improvements in efficiency and effectiveness.
- Build and manage a high-performance team of bright engineers. Lead by example through technical thought leadership and flawless execution.
- Improve the troubleshooting skills and capability of the team, quantified by the number of incidents the team can resolve independently.
- Act as the accountable technical owner for ops readiness of new modules that need to be supported, covering monitoring, adequate technical onboarding training, and preparedness to handle incidents.
- Drive various automation initiatives and ensure efficiency improvement by automating manual/routine tasks and SOPs. Decrease turnaround times, streamline work processes, and work cooperatively to provide quality, seamless customer service. Develop, enhance and maintain various tools for this purpose.
- Take current monitoring two notches higher and ensure the operations team is able to detect all critical issues before the customer does.
- Manage on-call support for 24x7 coverage.
- Ensure systems stay running in a stable state and meet SLA requirements.
- Set and maintain alerts within application monitoring software to ensure performance anomalies are reported immediately.
- Review new issue items (problems, incidents) assigned to the development group and ensure that sufficient information is available with each ticket to proceed with analysis and resolution.
- Assign tickets to team members or to another group when applicable.
- Coordinate business approval of production migrations.
- Facilitate the hand-off of new release functionality from the development team to the Production Support team.

Key Skills:
- 3 to 10 years of experience in an application production support environment, with an ability to solve complex problems.
- Adept on the Linux platform.
- Programming experience in PHP and Python.
- Exposure to networking, load balancers and messaging queues; strong database fundamentals (MySQL, MSSQL, Cassandra); AWS, Kafka.
- Deep knowledge of incident and problem management.

Job posted by Madri Prasad

Big Data Engineer

Founded 2015
Location: Navi Mumbai, Noida, NCR (Delhi | Gurgaon | Noida)
Experience: 1 - 2 years
Salary: 4 - 10 lacs/annum

Job Requirements:
- Installation, configuration and administration of Big Data components (including Hadoop/Spark) for batch and real-time analytics and data hubs
- Capable of processing large sets of structured, semi-structured and unstructured data
- Able to assess business rules, collaborate with stakeholders, and perform source-to-target data mapping, design and review
- Familiar with data architecture: data ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning and advanced data processing
- Optional: visual communicator, able to convert and present data in an easily comprehensible visualization using tools like D3.js and Tableau
- Enjoys being challenged and solving complex problems on a daily basis
- Proficient in executing efficient and robust ETL workflows
- Able to work in teams and collaborate with others to clarify requirements
- Able to tune Hadoop solutions to improve performance and the end-user experience
- Strong coordination and project management skills to handle complex projects
- Engineering background
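The posting names Hadoop/Spark batch processing but contains no code; purely as an illustrative sketch of the kind of batch job described above (the file paths and column names are hypothetical), a small PySpark aggregation might look like this:

```python
# Illustrative only: a minimal PySpark batch job in the spirit of the
# requirements above. File paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("events-batch-aggregation")
         .getOrCreate())

# Ingest semi-structured JSON events from a (hypothetical) landing zone.
events = spark.read.json("hdfs:///data/landing/events/*.json")

# Simple source-to-target mapping: keep a few fields, derive a date column.
daily_counts = (events
                .select("user_id", "event_type", "ts")
                .withColumn("event_date", F.to_date("ts"))
                .groupBy("event_date", "event_type")
                .count())

# Write curated output as Parquet for downstream analytics.
daily_counts.write.mode("overwrite").parquet("hdfs:///data/curated/daily_event_counts")

spark.stop()
```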

Job posted by Sneha Pandey

Big Data Developer

Founded 2017
Location: Mumbai
Experience: 2 - 4 years
Salary: 6 - 15 lacs/annum

Job Title: Software Developer – Big Data

Responsibilities:
We are looking for a Big Data Developer who can drive innovation, take ownership and deliver results.
• Understand business requirements from stakeholders
• Build and own Mintifi's Big Data applications
• Be heavily involved in every step of the product development process, from ideation to implementation to release
• Design and build systems with automated instrumentation and monitoring
• Write unit and integration tests
• Collaborate with cross-functional teams to validate and get feedback on the efficacy of results created by the big data applications, and use the feedback to improve the business logic
• Take a proactive approach to turning ambiguous problem spaces into clear design solutions

Qualifications:
• Hands-on programming skills in Apache Spark using Java or Scala
• Good understanding of data structures and algorithms
• Good understanding of relational and non-relational database concepts (MySQL, Hadoop, MongoDB)
• Experience with Hadoop ecosystem components like YARN and ZooKeeper would be a strong plus

Job posted by Suchita Upadhyay

Cassandra Admin for IoT Platform

Founded 1989
Location: Bengaluru (Bangalore)
Experience: 2 - 9 years
Salary: 12 - 25 lacs/annum

As a Cassandra Administrator, you'll be responsible for the administration and governance of a complex analytics platform that is already changing the way large industrial companies manage their assets. A Cassandra Administrator understands cutting-edge tools and frameworks and is able to determine the best tools for any given task. You will enable and work with our other developers to use cutting-edge technologies in fields such as distributed systems, data ingestion and mapping, and machine learning. We also strongly encourage everyone to tinker with existing tools, and to stay up to date and test new technologies, all with the aim of ensuring that our existing systems don't stagnate or deteriorate.

Responsibilities (may include, but are not limited to, the following):
• Build a scalable Cassandra platform designed to serve many different use cases and requirements
• Build a highly scalable framework for ingesting, transforming and enhancing data at web scale
• Establish automated build and deployment pipelines
• Implement machine learning models that enable customers to glean hidden insights about their data
• Implement security and integrate with components such as LDAP, AD, Sentry and Kerberos
• Apply a strong understanding of row-level and role-based security concepts such as inheritance
• Establish scalability benchmarks for predictable scalability thresholds

Qualifications:
• Bachelor's degree in Computer Science or a related field
• Experience with NoSQL data stores: Cassandra, HDFS and/or Elasticsearch
• 4+ years of system-building experience
• 2+ years of programming experience using Cassandra
• A passion for DevOps and an appreciation for continuous integration/deployment
• A passion for QA and an understanding that testing is not someone else's responsibility
• Experience automating infrastructure and build processes
• Outstanding programming and problem-solving skills
• Strong passion for technology and building great systems
• Excellent communication skills and ability to work using Agile methodologies
• Ability to work quickly and collaboratively in a fast-paced, entrepreneurial environment
• Experience with service-oriented (SOA) and event-driven (EDA) architectures
• Experience using big data solutions in an AWS environment
• Experience with JavaScript or associated technologies

If you think you would be a good fit for this role, and are interested in joining the best engineering team in IoT, please provide your resume.
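The posting does not specify any tooling; as a hedged illustration of routine Cassandra administration work (keyspace and table setup with an explicit replication strategy), a short sketch using the open-source Python cassandra-driver might look like the following. The contact point, keyspace and schema are hypothetical.

```python
# Illustrative sketch only: creating a keyspace and table with the Python
# cassandra-driver. Host, keyspace and schema are hypothetical examples.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])      # contact point(s) of the cluster
session = cluster.connect()

# NetworkTopologyStrategy with a per-datacenter replication factor is the
# usual choice for multi-DC production clusters.
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS telemetry
    WITH replication = {'class': 'NetworkTopologyStrategy', 'dc1': 3}
""")

session.execute("""
    CREATE TABLE IF NOT EXISTS telemetry.sensor_readings (
        sensor_id text,
        reading_time timestamp,
        value double,
        PRIMARY KEY (sensor_id, reading_time)
    ) WITH CLUSTERING ORDER BY (reading_time DESC)
""")

cluster.shutdown()
```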

Job posted by Arjun Ravindran

Data ETL Engineer

Founded 2013
Location: Chennai
Experience: 1 - 3 years
Salary: 5 - 12 lacs/annum

Responsibilities:
- Design and develop the ETL framework and data pipelines in Python 3.
- Orchestrate complex data flows from various data sources (RDBMS, REST APIs, etc.) to the data warehouse and vice versa.
- Develop app modules (in Django) for enhanced ETL monitoring.
- Devise technical strategies for making data seamlessly available to the BI and Data Science teams.
- Collaborate with engineering, marketing, sales, and finance teams across the organization and help Chargebee develop complete data solutions.
- Serve as a subject-matter expert for available data elements and analytic capabilities.

Qualifications:
- Expert programming skills with the ability to write clean and well-designed code.
- Expertise in Python, with knowledge of at least one Python web framework.
- Strong SQL knowledge and high proficiency in writing advanced SQL.
- Hands-on experience in modeling relational databases.
- Experience integrating with third-party platforms is an added advantage.
- Genuine curiosity, proven problem-solving ability, and a passion for programming and data.
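The role describes pipelines from REST APIs into a warehouse but gives no code; purely as a hedged sketch of that extract-transform-load shape (the API URL, fields and target table are hypothetical, and a real pipeline would target the actual warehouse rather than SQLite):

```python
# Minimal illustrative ETL sketch: pull records from a (hypothetical) REST
# endpoint, reshape them, and load them into a local SQLite table standing
# in for a warehouse. Not a production pipeline.
import sqlite3
import requests

API_URL = "https://api.example.com/v1/invoices"   # hypothetical endpoint


def extract(url):
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()            # assume a list of JSON objects


def transform(records):
    # Keep only the fields the warehouse table needs, normalising amounts.
    return [(r["id"], r["customer_id"], float(r["amount"])) for r in records]


def load(rows, db_path="warehouse.db"):
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS invoices "
                "(id TEXT PRIMARY KEY, customer_id TEXT, amount REAL)")
    con.executemany("INSERT OR REPLACE INTO invoices VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()


if __name__ == "__main__":
    load(transform(extract(API_URL)))
```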

Job posted by Vinothini Sundaram

Sr Data Scientist

Founded 2016
Location: Mumbai
Experience: 3 - 5 years
Salary: 6 - 10 lacs/annum

About us: GreyAtom is a Mumbai-based ed-tech company specializing in upskilling tech professionals by harnessing the power of data science. We are a turnkey solution for upgrading your skill set and career prospects.

The Data Science team at GreyAtom is on a mission to build systemic intelligence across the GreyAtom product and ecosystem. We are looking for a team player who understands software engineering and data science and has a good grasp of the business.

Some of the problems we are currently focusing on:
- What is a learner's competency across various modules?
- How does a learner compare against other people in the ecosystem?
- Does my learning behaviour match that of people who got jobs in data science?
- Attrition alerts for students
- Personalization of the learning path for each student
- Factors that predict a learner's success or drop-out risk
- Mapping the skill and competency matrix for each learner

At GreyAtom, data scientists are embedded with the engineering and product team for the problem they are working on. This ensures that data science solutions are envisioned along with product delivery. We have a very flat structure within the Data Science team, which enables us to focus on excellence and create a deep sense of ownership. Being a young team, we are also able to democratize the process of problem selection. Our techniques span classification, clustering, matrix factorization, graphical models, networks and graph algorithms, topic modeling, image processing, deep learning and NLP, each exercised at a fairly large scale. If you want to challenge the state of the art and impact the wide-open landscape in India, the GreyAtom Data Science team is the place for you.

We want a passionate data scientist who has experience executing and evangelizing Machine Learning or AI technologies to solve business problems, resulting in uncompromised user experience, cost savings and business insights elicited from big data.

Your impact towards CA: In this role, you'll help support GreyAtom's charter to build Dataware by:
- Communicating with scientists as well as engineers
- Bringing about significant innovation and solving complex problems in analytics-based projects
- Possibly having indirect reports and managing a small project team
- Mentoring, training, developing and serving as a knowledge resource for less experienced software engineers and data professionals
- Working and collaborating with the Product team to build data science into Commit.Live
- Conceptualising, designing and delivering high-quality solutions and insightful analysis
- Conducting research and prototyping innovations; data and requirements gathering; solution scoping and architecture

Skills required:
- Typically 3 or more years of experience executing projects as a lead, plus analytic computing experience
- Mathematical skills including statistics fundamentals, statistical modelling, regression analysis, time series, decision trees, correlation (clustering, association rules, k-nearest neighbours)
- Analytical skills including data analytics, data modelling, machine learning, text mining, optimization
- Simulation skills (genetic algorithms, Monte Carlo simulations, linear programming, quadratic programming, etc.)
- Data-driven problem solving and data munging
- Papers published in journals in the ML/AI area are an added advantage
- Hands-on experience with Python
- Understanding and manipulation of unstructured data
- Experience with one or more cloud or DevOps services such as AWS or Docker
- Good business acumen in any vertical, preferably the ed-tech / learning analytics vertical

Job posted by Shareena Fernandes

Java Developer

Founded 2012
Location: Bengaluru (Bangalore)
Experience: 2 - 20+ years
Salary: 8 - 30 lacs/annum

Looking for extremely smart software engineers who can solve complex distributed software issues. Someone who has handled lots of structured and unstructured data is preferred.

Job posted by Prasanna Gopinath

Tech Lead Backend

Founded 2016
Location: Bengaluru (Bangalore)
Experience: 5 - 10 years
Salary: 15 - 50 lacs/annum

RESPONSIBILITIES:
1. Full ownership of tech, right from driving product decisions to architecture to deployment.
2. Develop a cutting-edge user experience and build cutting-edge technology solutions like instant messaging on poor networks, live discussions, live videos and optimal matching.
3. Use billions of data points to build a user personalization engine.
4. Build a data network effects engine to increase engagement and virality.
5. Scale the systems to billions of daily hits.
6. Deep dive into performance, power management, memory optimization and network connectivity optimization for the next billion Indians.
7. Orchestrate complicated workflows, asynchronous actions, and higher-order components.
8. Work directly with the Product and Design teams.

REQUIREMENTS:
1. Should have hacked some (computer or non-computer) system to your advantage.
2. Built and managed systems with a scale of 10Mn+ daily hits.
3. Strong architectural experience.
4. Strong experience in memory management, performance tuning and resource optimization.
5. PREFERENCE: if you are a woman, an ex-entrepreneur, or hold a CS bachelor's degree from IIT/BITS/NIT.

P.S. If you don't fulfil one of the requirements, you need to be exceptional in the others to be considered.

Job posted by Shubham Maheshwari

Data Scientist/ ML engineer

Founded 2012
Location: Bengaluru (Bangalore)
Experience: 3 - 7 years
Salary: 10 - 25 lacs/annum

HackerEarth provides enterprise software solutions that help organisations with their innovation management and talent assessment needs. HackerEarth Recruit is a talent assessment platform that enables efficient technical talent screening, allowing organisations to build strong, proficient teams. HackerEarth Sprint is innovation management software that helps organisations drive innovation through internal and external talent pools, including HackerEarth's global community of 2M+ developers. Today, HackerEarth serves 750+ organizations, including leading Fortune 500 companies from around the world. General Electric, IBM, Amazon, Apple, Wipro, Walmart Labs and Bosch are some of the brands that trust HackerEarth to help them drive growth.

Job Description:
We are looking for an ML Engineer who will help us discover the information hidden in vast amounts of data, and help us make smarter decisions to deliver even better products. Your primary focus will be on applying data mining techniques, doing statistical analysis, and building high-quality models integrated with our products. You will primarily work on recommendation engines, text classification, automated tagging of documents, lexical similarity, semantic similarity and similar problems to start with.

Responsibilities:
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state-of-the-art methods
- Extending the company's data with third-party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad-hoc analysis and presenting results clearly
- Creating automated anomaly detection systems and constantly tracking their performance
- Developing custom data models and algorithms to apply to data sets
- Assessing the effectiveness and accuracy of new data sources and data gathering techniques
- Coordinating with different functional teams to implement models and monitor outcomes
- Developing processes and tools to monitor and analyze model performance and data accuracy

Skills and Qualifications:
- 4+ years of experience using statistical computer languages (R, Python, etc.) to manipulate data and draw insights from large data sets
- Good applied statistics skills, such as distributions, statistical testing, regression, etc.
- Proficiency in query languages such as SQL, Hive, Pig
- Experience with distributed data/computing tools: MapReduce, Hadoop, Hive, Spark, etc.
- Experience using web services: Redshift, S3, etc.
- Experience creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modelling, clustering, decision trees, neural networks, etc.
- Knowledge and experience of statistical and data mining techniques: GLM/regression, random forests, boosting, trees, text mining, social network analysis, etc.
- Experience working with and creating data architectures
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages and drawbacks
- Excellent written and verbal communication skills for coordinating across teams
- A drive to learn and master new technologies and techniques

You should be creative, enthusiastic, and take pride in the work that you produce. Above all, you should love to build and ship solutions that real people will use every day.
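The posting mentions text classification and automated tagging without specifying a stack; a minimal, purely illustrative scikit-learn sketch of that kind of classifier (the tiny in-line documents and labels are made up) might look like this:

```python
# Illustrative text-classification sketch with scikit-learn.
# The tiny in-line dataset and labels are made up for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "binary search trees and graph traversal",
    "dynamic programming interview question",
    "quarterly hiring report and recruitment funnel",
    "candidate screening and assessment metrics",
]
labels = ["algorithms", "algorithms", "recruiting", "recruiting"]

# TF-IDF features feeding a linear classifier is a common, strong baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(docs, labels)

print(model.predict(["graph shortest path problem"]))   # likely ['algorithms']
```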

Job posted by Partha Dewri

Technical Lead - Big Data and Java

Founded 1997
Location: Pune
Experience: 2 - 7 years
Salary: 1 - 20 lacs/annum

Description:
Does solving complex business problems and real-world challenges interest you? Do you enjoy seeing the impact your contributions make on a daily basis? Are you passionate about using data analytics to provide game-changing solutions to Global 2000 clients? Do you thrive in a dynamic work environment that constantly pushes you to be the best you can be and more? Are you ready to work with smart colleagues who drive for excellence in everything they do? If you possess a solutions mindset, strong analytical skills, and commitment to be part of a tremendous journey, come join our growing, global team. See what Saama can do for your career and for your journey.

Position: Java/Big Data Lead (2162)
Location: Hinjewadi Phase 1, Pune
Type: Permanent, full time

Requirements. The candidate should be able to:
- Define application-level architecture and guide low-level database design
- Gather technical requirements and propose solutions based on the client's business and architectural needs
- Interact with prospective customers during product demos/evaluations
- Work internally with technology and business groups to define project specifications
- Showcase experience in cloud-based implementations and technically manage Big Data and J2EE projects
- Showcase hands-on programming and debugging skills in Spring, Hibernate, Java, JavaScript, JSP/Servlets, J2EE design patterns and Python
- Have knowledge of service integration concepts (especially RESTful services / SOAP-based web services)
- Design and develop solutions for non-functional requirements (performance analysis and tuning, benchmarking/load testing, security)

Impact on the business: Plays an important role in making Saama's solutions game changers for our strategic partners by using data science to solve core, complex business challenges.

Key relationships: Sales and pre-sales; Product Management; Engineering; Client organization: account management and delivery.

Saama competencies:
- INTEGRITY: we do the right things
- INNOVATION: we change the game
- TRANSPARENCY: we communicate openly
- COLLABORATION: we work as one team
- PROBLEM-SOLVING: we solve core, complex business challenges
- ENJOY & CELEBRATE: we have fun

Competencies:
- Self-starter who gets results with minimal support and direction in a fast-paced environment
- Takes initiative; challenges the status quo to drive change
- Learns quickly; takes smart risks to experiment and learn
- Works well with others; builds trust and maintains credibility
- Planful: identifies and confirms key requirements in dynamic environments; anticipates tasks and contingencies
- Communicates effectively; productive verbal and written communication with clients and all key stakeholders
- Stays the course despite challenges and setbacks; works well under pressure
- Strong analytical skills; able to apply inductive and deductive thinking to generate solutions for complex problems

Job posted by Sandeep Chaudhary

Data Scientist

Founded 1997
Location: Pune
Experience: 4 - 8 years
Salary: 1 - 16 lacs/annum

Description:
- Must have direct, hands-on experience (4+ years) building complex data science solutions
- Must have fundamental knowledge of inferential statistics
- Should have worked on predictive modelling using Python/R
- Experience should include: file I/O, data harmonization, data exploration; machine learning techniques (supervised, unsupervised); multi-dimensional array processing; deep learning; NLP; image processing
- Prior experience in the healthcare domain is a plus
- Experience using Big Data is a plus
- Should have excellent analytical and problem-solving ability and be able to grasp new concepts quickly
- Should be well familiar with the Agile project management methodology
- Should have excellent written and verbal communication skills
- Should be a team player with an open mind

Job posted by Sandeep Chaudhary

Big Data Lead

Founded 1997
Location: Pune
Experience: 2 - 5 years
Salary: 1 - 18 lacs/annum

Description:
- Deep experience and understanding of Apache Hadoop and surrounding technologies required; experience with Spark, Impala, Hive, Flume, Parquet and MapReduce.
- Strong understanding of development languages including Java, Python, Scala and shell scripting.
- Expertise in Apache Spark 2.x framework principles and usage.
- Should be proficient in developing Spark batch and streaming jobs in Python, Scala or Java.
- Should have proven experience in performance tuning of Spark applications, both from an application-code and a configuration perspective.
- Should be proficient in Kafka and its integration with Spark.
- Should be proficient in Spark SQL and data warehousing techniques using Hive.
- Should be very proficient in Unix shell scripting and operating on Linux.
- Should have knowledge of cloud-based infrastructure.
- Good experience in tuning Spark applications and performance improvements.
- Strong understanding of data profiling concepts and the ability to operationalize analyses into design and development activities.
- Experience with best practices of software development: version control systems, automated builds, etc.
- Experienced in, and able to lead, the phases of the Software Development Life Cycle on any project (feasibility planning, analysis, development, integration, test and implementation).
- Capable of working within a team or as an individual.
- Experience creating technical documentation.
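The posting asks for Spark streaming jobs integrated with Kafka but shows no code; a minimal, illustrative PySpark Structured Streaming sketch (topic name, broker address and schema are hypothetical) could look like this:

```python
# Illustrative sketch: consume a (hypothetical) Kafka topic with Spark
# Structured Streaming and maintain running counts per event type.
# Requires the spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType

spark = SparkSession.builder.appName("kafka-event-counts").getOrCreate()

schema = StructType([
    StructField("event_type", StringType()),
    StructField("user_id", StringType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
       .option("subscribe", "events")                       # hypothetical topic
       .load())

# Kafka values arrive as bytes; parse the JSON payload with the schema above.
events = (raw
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

counts = events.groupBy("event_type").count()

query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())

query.awaitTermination()
```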

Job posted by Sandeep Chaudhary

Big Data Architect

Founded 1997
Location: Pune
Experience: 2 - 5 years
Salary: 1 - 30 lacs/annum

Description:
Does solving complex business problems and real-world challenges interest you? Do you enjoy seeing the impact your contributions make on a daily basis? Are you passionate about using data analytics to provide game-changing solutions to Global 2000 clients? Do you thrive in a dynamic work environment that constantly pushes you to be the best you can be and more? Are you ready to work with smart colleagues who drive for excellence in everything they do? If you possess a solutions mindset, strong technological expertise, and commitment to be part of a tremendous journey, come join our growing, global team. See what Saama can do for your career and for your journey.

Impact on the business: The candidate would play a key role in delivering success by leveraging Web and Big Data technologies and tools to fulfil the client's business objectives.

Responsibilities:
- Participate in requirement-gathering sessions with business users and stakeholders to understand the business needs.
- Understand functional and non-functional requirements and define the technical architecture and design to cater to them.
- Produce a detailed technical design document to match the solution design specifications.
- Review and validate effort estimates produced by the development team for the design and build phases.
- Understand and apply the company's solutions/frameworks to the design when needed.
- Collaborate with the development team to produce a technical specification for custom development and systems integration requirements.
- Participate in, and lead when needed, project meetings with the customer.
- Collaborate with senior architects in the customer organization and convince/defend design and architecture decisions for the project.
- Be a technical mentor to the development team.

Required skills:
- Experience in designing scalable, complex distributed systems.
- Hands-on development experience in the Big Data Hadoop ecosystem and analytics space.
- Experience working with cloud storage solutions in AWS, Azure, etc.
- MS/BS degree in Computer Science, Mathematics, Engineering or a related field.
- 12 years of experience as a technology leader designing and developing data architecture solutions, with more than 2 years specializing in big data architecture or data analytics.
- Experience implementing solutions using Big Data technologies: Hadoop, MapReduce, Pig, Hive, Spark, Storm, Impala, Oozie, Flume, ZooKeeper, Sqoop, etc.
- Good understanding of NoSQL and prior experience working with NoSQL databases: HBase, MongoDB, Cassandra.

Competencies:
- Self-starter who gets results with minimal support and direction in a fast-paced environment.
- Takes initiative; challenges the status quo to drive change.
- Learns quickly; takes smart risks to experiment and learn.
- Works well with others; builds trust and maintains credibility.
- Identifies and confirms key requirements in dynamic environments; anticipates tasks and contingencies.
- Strong analytical skills; able to apply creative thinking to generate solutions for complex problems.
- Communicates effectively; productive communication with clients and all key stakeholders (both verbal and written).

Job posted by Sandeep Chaudhary

Senior Software Engineer

Founded 2013
Location: NCR (Delhi | Gurgaon | Noida)
Experience: 4 - 6 years
Salary: 15 - 18 lacs/annum

Requirements:
- Minimum 4 years' work experience in building, managing and maintaining analytics applications
- B.Tech/BE in CS/IT from a Tier 1/2 institute
- Strong fundamentals of data structures and algorithms
- Good analytical and problem-solving skills
- Strong hands-on experience in Python
- In-depth knowledge of queueing systems (Kafka/ActiveMQ/RabbitMQ)
- Experience in building data pipelines and real-time analytics systems
- Experience in SQL (MySQL) and NoSQL (Mongo/Cassandra) databases is a plus
- Understanding of service-oriented architecture
- Track record of delivering high-quality work with significant contributions
- Expert in git, unit tests, technical documentation and other development best practices
- Experience in handling small teams
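The listing names Kafka/ActiveMQ/RabbitMQ and real-time pipelines in Python without showing any code; purely as an illustrative sketch, a tiny consumer built on the open-source kafka-python client (topic and broker address are hypothetical) might look like this:

```python
# Illustrative sketch: consume JSON events from a (hypothetical) Kafka topic
# with the open-source kafka-python client and keep simple running counts.
import json
from collections import Counter

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "user-events",                               # hypothetical topic
    bootstrap_servers=["localhost:9092"],        # hypothetical broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="analytics-demo",
)

counts = Counter()
for message in consumer:
    event = message.value                        # already parsed into a dict
    counts[event.get("event_type", "unknown")] += 1
    print(dict(counts))
```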

Job posted by Tanika Monga

Machine Learning Developer

Founded 2017
Location: NCR (Delhi | Gurgaon | Noida)
Experience: 1 - 5 years
Salary: 15 - 16 lacs/annum

We are looking for a Machine Learning Developer who possesses a passion for machine technology and big data and will work with our next-generation Universal IoT platform.

Responsibilities:
• Design and build machines that learn from, predict and analyze data
• Build and enhance tools to mine data at scale
• Enable the integration of machine learning models in the Chariot IoT Platform
• Ensure the scalability of machine learning analytics across millions of networked sensors
• Work with other engineering teams to integrate our streaming, batch, or ad-hoc analysis algorithms into Chariot IoT's suite of applications
• Develop generalizable APIs so other engineers can use our work without needing to be machine learning experts

Job posted by Raj Garg

Senior Software Engineer - Backend

Founded 2015
Location: Pune
Experience: 5 - 10 years
Salary: 17 - 25 lacs/annum

Responsibilities:
- Ensure timely and top-quality product delivery
- Ensure that the end product is fully and correctly defined and documented
- Ensure implementation/continuous improvement of formal processes to support product development activities
- Drive the architecture/design decisions needed to achieve cost-effective and high-performance results
- Conduct feasibility analysis; produce functional and design specifications of proposed new features
- Provide helpful and productive code reviews for peers and junior members of the team
- Troubleshoot complex issues discovered in-house as well as in customer environments

Qualifications:
- Strong computer science fundamentals in algorithms, data structures, databases, operating systems, etc.
- Expertise in Java, object-oriented programming and design patterns
- Experience in coding and implementing scalable solutions in a large-scale distributed environment
- Working experience in a Linux/UNIX environment is good to have
- Experience with relational databases and database concepts, preferably MySQL
- Experience with SQL and Java optimization for real-time systems
- Familiarity with version control systems (Git) and build tools like Maven
- Excellent interpersonal, written, and verbal communication skills
- BE/B.Tech./M.Sc./MCS/MCA in Computers or equivalent

Job posted by Sourabh Gandhe

Frontend Engineer - HTML/CSS/Javascript

Founded 2000
Location: Mumbai
Experience: 2 - 5 years
Salary: 7 - 20 lacs/annum

Role Brief: 2-3 years of experience in JavaScript (including ES2015), HTML5, SVG, CSS3/SCSS. The engineer will use his/her skills and experience to translate graphical designs into delightful implementations, manage and enhance the application user experience, and help train other developers.

Brief about the team and Fractal: Fractal Analytics is leading Fortune 500 companies to leverage Big Data, analytics, and technology to drive smarter, faster and more accurate decisions in every aspect of their business. Trial Run is a cloud-based product that is helping businesses solve bigger enterprise problems. Our team is a mix of passionate data scientists, engineers, product managers and domain experts. Together, our mission is to build world-class products to help our customers create a culture of experimentation.

Job responsibilities:
- Review designs created by web designers, clarify issues and implement the designs
- Implement intuitive and interactive visualizations to present analytical insights to users
- Collaborate closely with designers, engineers, client support, and data scientists to implement and improve the functionality of the application
- Train other developers on new technologies
- Follow and introduce industry best practices in the software development lifecycle
- Optimize and debug applications
- Maintain updated knowledge of the development industry and any advancements in technology

Experience:
Must have: 2-3 years of experience in JavaScript (including ES2015), HTML5, SVG, CSS3/SCSS
Good to have: Proficiency working with JavaScript libraries, server frameworks, control systems, testing libraries, and REST APIs

Education: Any

Job posted by Mili Panicker

Senior Software Engineer

Founded 2008
Location: Hyderabad
Experience: 6 - 11 years
Salary: 10 - 20 lacs/annum

Description

Who We Are
Bridge International Academies is the world's largest and fastest-growing chain of primary and pre-primary schools, with more than 500 academies and 100,000 pupils in Kenya, Uganda, Nigeria, India, and Liberia. We democratize the right to succeed by giving families living in poverty access to the high-quality education that will allow their children to live a very different life. We leverage experts, data, and technology in order to standardize and scale every aspect of quality education delivery, from how and where academies are built to how teachers are selected and trained, and how lessons are delivered and monitored for improvement. We are vertically integrated, tech-enabled, and on our way to profitability. Bridge expects to continue rapid expansion in 2018 across existing markets.

The Bridge Offer
Roughly 2.7 billion people live on less than $2/day. In their communities, there is a huge gap between the education offered and the needs of the population. Too often the schools available to them fail to deliver for these families. The quality offered results in the average pupil from our communities in East Africa failing to reach proficiency in primary school and, on average, failing the primary exit exams that are critical to their development. Teachers are unresponsive and occasionally abusive, and fees are often unaffordable. Even government schools can cost families a significant amount of money after all the additional fees are added up. With 47% of classroom teaching time lost due to teacher absenteeism or neglect, 55% of families in our communities end up choosing private schools instead, but then fear for the stability and sustainability of their choice, as many schools close after only a few years of service. Both the government schools and the private schools tend to lack well-conceived scope and sequences, instructional materials, student achievement data, and the capacity to react to that data. Families are actively searching for a better academic alternative.

Enter Bridge International Academies. As of September 2017, Bridge operates more than 500 academies, serving roughly 100,000 pupils in Kenya, Uganda, Nigeria, India, and Liberia. Bridge utilises a scripted-learning education methodology coupled with 'big data' (all teachers have tablets for instruction, assessment, and data-gathering) that allows us to make the curriculum a little better every day. With plans to enrol ten million students ten years from now, Bridge International Academies offers a tremendous opportunity to grow with one of the world's most exciting, ambitious, and socially conscious companies, with leadership roles available across a number of competencies and geographies.

Tech at Bridge
Technology plays a critical role at Bridge in enabling us to provide education at massive scale and low cost - it's one of the key elements that gives us the ability to deliver what no one else can. Tech spans several key functions, from the hardware and software that our academies use to run all aspects of teaching and management, including mobile payments, to the systems that enable our country headquarters to manage massive local operations, to the data backbone that informs all of our strategic and tactical decision making. It's a lot of custom software development and a lot of back-office systems. We've got a ridiculously ambitious mission at Bridge, and it's a place where passionate technologists have a chance to directly change the world. No kidding.

About the Role
Tech at Bridge is a highly complex, vertically integrated affair, with systems supporting an ever-expanding range of functions and countries, and crossing between software development, IT operations, academy operations, and logistics/supply chain. At the same time, our teams run lean and things change fast - governments make policy decisions that affect us, launching new countries is a frenetic affair, and we still need to evolve our core technology offering. We are looking for a full-time Senior Software Engineer to join our new Hyderabad-based cross-functional software development team, which will participate in building the software that powers and improves efficiency to enhance our competitive advantage. This person should be familiar with design and implementation issues specific to data-driven, highly scalable environments and be able to handle such issues with flexibility and ingenuity. The ideal candidate will have a strong customer focus, a proven track record of delivering high-quality products in a continuous delivery environment, and an appreciation for clean and simple code. Bridge especially values T-shaped team members - individuals with deep expertise in particular areas, but comfortable working across all parts of the technology stack.

What You Will Do
- Assume ownership over the server-side architecture of the Bridge software platforms
- Design, implement, and support new products and features
- Analyse and improve the server-side architecture with a focus on maintainability and scalability
- Mentor and guide junior engineers, including performing code reviews
- Collaborate with project sponsors to elaborate requirements and facilitate trade-offs that maximise customer value
- Work with product and development teams to establish overall technical direction and product strategy

What You Will Have
- A BA/BS in Computer Science or a related technical field
- 6 years of enterprise software development experience
- Comfort recommending and advocating for enterprise architectural best practices for highly available, scalable, and reliable implementations
- Direct experience integrating off-the-shelf and custom-built software, and an understanding of the trade-offs between building and buying software
- The ability to function well in a fast-paced, informal environment where constant change is the norm and the bar for quality is set high
- Enterprise-level experience with continuous delivery practices and tools (e.g. Jenkins, Bamboo, GoCD, Octopus)
- Proficiency in test-driven development (TDD) and/or behaviour-driven development (BDD)
- Expertise in four or more of the following areas, and interest in learning the rest:
  - C#/.NET
  - Web services (esp. WebAPI or NancyFx; Richardson L2)
  - Cloud environments (esp. AWS) and architectures/implementations (e.g. CQRS/ES, circuit breakers, messaging, etc.)
  - Enterprise application performance monitoring (e.g. E.L.K., Nagios, NewRelic, Riverbed)
  - System security (e.g. OWASP, OAuth)
  - Infrastructure-as-Code (e.g. Puppet, Chef, Ansible, Docker, boxstarter, chocolatey/WinRM/powershell)
  - MS SQL Server/T-SQL

You must have worked in an agile delivery environment and understand not only the mechanics but also the underlying motivations. Bridge is primarily a .NET shop (server-side), so experience in this area is preferable; however, Bridge also values developers with diverse experience, so serious exposure to other languages and ecosystems (e.g. NodeJS, Ruby, functional languages, NoSQL DBs) is a bonus. Bridge is a strong supporter of open-source projects - familiarity with OSS projects is a plus; contributions to open-source projects are a big plus.

You're also:
- A detailed doer - You have a track record of getting things done. You're organized and responsive. You take ownership of every idea you touch and execute it to a fine level of detail, setting targets, engaging others, and doing whatever it takes to get the job done. You can multi-task dozens of such projects at once and never lose sight of the details. Likely, you have some experience in a start-up or other rapid-growth company.
- A networking mastermind - You excel at meeting new people and turning them into advocates. You communicate in a clear, conscientious, and effective way in both written and oral speech. You can influence strangers in the course of a single conversation. Allies and colleagues will go to bat for your ideas.
- A creative problem-solver - Growing any business from scratch comes with massive and constant challenges. On top of that, Bridge works in volatile, low-resource communities and runs on fees averaging just $6 a month per pupil. You need to be flexible and ready to get everything done effectively, quickly, and affordably with the materials at hand. Every dollar you spend is a dollar our customers, who live on less than $2 a day, will have to pay for.
- A customer advocate - Our customers - these families living on less than $2 a day per person - never leave your mind. You know them, get them, have shared a meal with them (or would be happy to in the future). You would never shrink back from shaking a parent's hand or picking up a crying child, no matter what the person was wearing or looked like. Every decision you make considers their customer benefit, experience, and value.
- A life-long learner - You believe you can always do better. You welcome constructive criticism and provide it freely to others. You know you only get better tomorrow when others point out where you've missed things or failed today.

Job posted by Payyalore Sainath

Big Data Engineer

Founded 2012
Location: Mumbai
Experience: 2 - 7 years
Salary: 4 - 20 lacs/annum

As a Big Data Engineer, you will build utilities that help orchestrate the migration of massive Hadoop/Big Data systems onto public cloud platforms. You will build data processing scripts and pipelines that serve a large number of jobs and queries per day. The services you build will integrate directly with cloud services, opening the door to new and cutting-edge reusable solutions. You will work with engineering teams, co-workers, and customers to gain new insights and dream of new possibilities.

The Big Data Engineering team is hiring in the following areas:
• Distributed storage and compute solutions
• Data ingestion, consolidation, and warehousing
• Cloud migrations and replication pipelines
• Hybrid on-premise and in-cloud Big Data solutions
• Big Data, Hadoop and Spark processing

Basic Requirements:
• 2+ years of hands-on experience in data structures, distributed systems, Hadoop and Spark, SQL and NoSQL databases
• Strong software development skills in at least one of: Java, C/C++, Python or Scala
• Experience building and deploying cloud-based solutions at scale
• Experience in developing Big Data solutions (migration, storage, processing)
• BS, MS or PhD degree in Computer Science or Engineering, and 5+ years of relevant work experience in Big Data and cloud systems
• Experience building and supporting large-scale systems in a production environment

Technology Stack:
Cloud platforms – AWS, GCP or Azure
Big Data distributions – any of Apache Hadoop/CDH/HDP/EMR/Google Dataproc/HDInsight
Distributed processing frameworks – one or more of MapReduce, Apache Spark, Apache Storm, Apache Flink
Database/warehouse – Hive, HBase, and at least one cloud-native service
Orchestration frameworks – any of Airflow, Oozie, Apache NiFi, Google Dataflow
Message/event solutions – any of Kafka, Kinesis, cloud pub/sub
Container orchestration (good to have) – Kubernetes or Swarm

Job posted by Anwar Shaikh

Senior Data Engineer

Founded 2011
Location: Bengaluru (Bangalore)
Experience: 3 - 7 years
Salary: 4 - 12 lacs/annum

Sr Data Engineer Job Description

About Us: DataWeave is a data platform which aggregates publicly available data from disparate sources and makes it available in the right format to enable companies to take strategic decisions using trans-firewall analytics. It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems there are. We are in the business of making sense of messy public data on the web. At serious scale!

Requirements:
- Building an intelligent and highly scalable crawling platform
- Data extraction and processing at scale
- Enhancing existing data stores/data models
- Building a low-latency API layer for serving data to power dashboards, reports, and analytics functionality
- Constantly evolving our data platform to support new features

Expectations:
- 4+ years of relevant industry experience
- Strong algorithms and problem-solving skills
- Software development experience in one or more general-purpose programming languages (e.g. Python, C/C++, Ruby, Java, C#)
- Exceptional coding abilities and experience with building large-scale and high-availability applications
- Experience in search/information retrieval platforms like Solr, Lucene and Elasticsearch
- Experience in building and maintaining large-scale web crawlers
- In-depth knowledge of SQL and NoSQL datastores
- Ability to design and build quick prototypes
- Experience in working on cloud-based infrastructure like AWS, GCE

Growth at DataWeave:
- Fast-paced growth opportunities at a dynamically evolving start-up
- You have the opportunity to work in many different areas and explore a wide variety of tools to figure out what really excites you

Job posted by Sandeep Sreenath

Artificial Intelligence Developers

Founded 2016
Location: NCR (Delhi | Gurgaon | Noida)
Experience: 1 - 3 years
Salary: 3 - 9 lacs/annum

Precily AI: Automatic summarization, shortening a business document or book with our AI to create a summary of the major points of the original document. The AI can produce a coherent summary taking into account variables such as length, writing style, and syntax. We're also working in the legal domain to reduce the high number of pending cases in India. We use Artificial Intelligence and Machine Learning capabilities such as NLP and neural networks to process data and provide solutions for industries such as enterprise, healthcare, and legal.
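The posting describes automatic summarization only at a high level; purely as an illustrative sketch (not Precily's actual method), a naive frequency-based extractive summarizer in plain Python might look like this:

```python
# Naive extractive summarization sketch: score sentences by word frequency
# and keep the top-scoring ones. Illustrative only; not a production method.
import re
from collections import Counter


def summarize(text, max_sentences=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    # Keep the highest-scoring sentences, preserving their original order.
    top = set(sorted(sentences, key=score, reverse=True)[:max_sentences])
    return " ".join(s for s in sentences if s in top)


if __name__ == "__main__":
    doc = ("Automatic summarization shortens a document while keeping its main points. "
           "Extractive methods select existing sentences; abstractive methods generate new ones. "
           "A simple baseline scores each sentence by the frequency of its words.")
    print(summarize(doc))
```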

Job posted by Bharath Rao

Sr Java Application Developer for Enterprise Data Analytics Platform

Founded 2014
Location: Pune
Experience: 3 - 7 years
Salary: 4 - 12 lacs/annum

Must-have skills:
- Very strong coding skills in Core Java (1.5 and above)
- Should be able to analyze complex code structures, data structures, algorithms/logic
- Should have hands-on knowledge of working on Java multithreading (juml) programs
- Should have expertise in the Java Collections framework
- Must have good exposure to Struts/JSP services, jQuery/Ajax and JSON-based UI rendering

Good-to-have skills (not mandatory):
- Good working knowledge of the JavaScript/jQuery framework
- Should have used the HTML5/CSS5/Node.js/D3 framework in at least one earlier project
- Hands-on experience with the latest technologies like Cassandra, Solr and Hadoop would be an advantage
- Knowledge of graph structures would be desirable

Job posted by Neha Ambastha

DevOps Head

Founded 2009
Location: Pune
Experience: 8 - 12 years
Salary: 5 - 15 lacs/annum

DevOps Architect, responsible for designing and implementing DevOps-related work and for clarifying system/deployment issues directly with the customer.

Job posted by Pankaj Gajjar

Technical Architect

Founded 2009
Location: Pune
Experience: 3 - 15 years
Salary: 2 - 12 lacs/annum

Work on different POCs. Experience in Java/J2EE programming and coding, and much more.

Job posted by Pankaj Gajjar

DevOps Engineer

Founded 2009
Location: Pune
Experience: 3 - 10 years
Salary: 3 - 8 lacs/annum

LAMP stack; configuration of ES and Cassandra; replication; fixing performance issues in the infrastructure; consulting.

Job posted by Pankaj Gajjar

Python Developer (3-5 years exp)

Founded 2016
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[2 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
3 - 5 years
Experience icon
2 - 10 lacs/annum

Artificial Learning Systems India Pvt. Ltd. is looking for an exceptional Python Developer who has a good background in, and understanding of, software systems, and who can work closely with the rest of the Engineering team from the early stages of design all the way through identifying and resolving production issues.

Candidate Profile: The ideal candidate will be passionate about this role, which involves deep knowledge of both the application and the product, and will also believe that automation is key to operating large-scale systems.

Education: BE/B.Tech from a reputed college.

Technical skills required:
• 3+ years' experience as a web developer in Python
• Software design skills in product development
• Proficiency in a modern open-source NoSQL database, preferably Cassandra
• Proficient in the HTTP protocol, REST APIs, and JSON
• Experience with Flask (must have); Django (good to have)
• Experience with Gunicorn, Celery, RabbitMQ, Supervisor

Job Type: Full-time, permanent. Job Location: Bangalore.

Who are we? Artificial Learning Systems (Artelus) is a two-year-old company working in the Deep Learning space to solve healthcare problems. The company seeks to make products that complement the knowledge of clinicians and assist them in making faster and more accurate diagnoses. Our team comprises a group of dedicated scientists trying to make the world a healthier place using the latest advances in computer science and machine learning, applied to the field of medicine and healthcare.

Why work with Artelus? We are working on exciting new scientific developments in the area of healthcare, and working with us will give you a solid education whatever your level of experience. This is a very exciting opportunity for a young scientist, and we look forward to working with you to help you develop your skills in our R&D center.

What does working with Artelus mean to you?
• Working in a high-energy and challenging environment
• Working with international clients
• Working with cutting-edge technologies
• Being part of an exciting, path-breaking project
• A great environment to work in
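For context on the Flask and Gunicorn items above, here is a minimal sketch of the kind of JSON REST endpoint such a role works with (the endpoint names and payloads are purely illustrative, not from the posting):

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/api/v1/health", methods=["GET"])
def health():
    # Simple liveness endpoint returning JSON.
    return jsonify(status="ok")

@app.route("/api/v1/echo", methods=["POST"])
def echo():
    # Echo back the posted JSON payload, illustrating basic request handling.
    payload = request.get_json(force=True)
    return jsonify(received=payload), 201

if __name__ == "__main__":
    # In production such an app would typically be served by Gunicorn,
    # e.g. `gunicorn app:app --workers 4`, rather than the Flask dev server.
    app.run(debug=True)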

Job posted by
Bindu Varma
Apply for job

Python Tech Lead (5-8 years of experience)

Founded 2016
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[2 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
5 - 8 years
Experience icon
5 - 24 lacs/annum

Artificial Learning Systems India Pvt. Ltd. is looking for an exceptional Technical Lead who will work with the engineering team, has a good background in, and understanding of, software systems, and can work closely with the rest of the Engineering team from the early stages of design all the way through identifying and resolving production issues.

Candidate Profile: The ideal candidate will be passionate about this role, which involves deep knowledge of both the application and the product, and will also believe that automation is key to operating large-scale systems. She/he should have created and led an application development from scratch and should have good experience scaling databases.

Education: BE/B.Tech from a reputed college.

Technical skills required:
• 5+ years' experience as a backend web developer in Python
• Software design skills in product development
• Strong understanding of the three key areas of web application architecture: server backend, frontend presentation (HTML, CSS), and interactive web (JavaScript and jQuery)
• Proficiency in a modern open-source NoSQL database, preferably Cassandra
• Proficient in the HTTP protocol, REST APIs, and JSON
• Ability to do database design and modeling
• Experience with Flask (must have); Django (good to have)
• Experience with Gunicorn, Celery, RabbitMQ
• Experience in AWS
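For the Celery and RabbitMQ items above, a minimal background-task sketch (the broker URL and task are hypothetical, not from the posting):

from celery import Celery  # assumes the 'celery' package is installed

# Hypothetical broker URL for a local RabbitMQ instance.
app = Celery("tasks", broker="amqp://guest:guest@localhost:5672//")

@app.task
def score_report(report_id):
    """Placeholder background task: in a real system this might run a long
    analysis job outside the web request/response cycle."""
    return {"report_id": report_id, "status": "processed"}

# A web request handler would enqueue work with:
#   score_report.delay(42)
# and a worker started with `celery -A tasks worker` would execute it.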

Job posted by
Bindu Varma
Apply for job

Technical Architect/CTO

via auzmor
Founded 2017
Products and services{{j_company_types[1 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[2 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Chennai
Experience icon
3 - 10 years
Experience icon
10 - 30 lacs/annum

Description: Auzmor is a US-headquartered, funded SaaS startup focused on disrupting the HR space. We combine passion and domain expertise and build products with a focus on great end-user experiences. We are looking for a Technical Architect to envision, build, launch, and scale multiple SaaS products.

What You Will Do:
• Understand the broader strategy, business goals, and engineering priorities of the company and how to incorporate them into your designs of systems, components, or features
• Design applications and architectures for multi-tenant SaaS software
• Be responsible for the selection and use of frameworks, platforms, and design patterns for cloud-based multi-tenant SaaS applications
• Collaborate with engineers, QA, product managers, UX designers, partners/vendors, and other architects to build scalable systems, services, and products for our diverse ecosystem of users across apps

What you will need:
• Minimum of 5+ years of hands-on engineering experience in SaaS and cloud-services environments, with architecture design and definition experience using Java/JEE, Struts, Spring, JMS & ORM (Hibernate, JPA) or other server-side technologies and frameworks
• Strong understanding of architecture patterns such as multi-tenancy, scalability, federation, and microservices (design, decomposition, and maintenance) to build cloud-ready systems
• Experience with server-side technologies (preferably Java or Go), frontend technologies (HTML/CSS, native JS, React, Angular, etc.), and testing frameworks and automation (PHPUnit, Codeception, Behat, Selenium, WebDriver, etc.)
• Passion for quality and engineering excellence at scale

What we would love to see:
• Exposure to Big Data-related technologies such as Hadoop, Spark, Cassandra, MapReduce, or NoSQL, and to data management, data retrieval, data quality, ETL, and data analysis
• Familiarity with containerized deployments and cloud computing platforms (AWS, Azure, GCP)

Job posted by
Loga B
Apply for job

Hadoop Developer

Founded 2016
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[1 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Mumbai
Experience icon
3 - 20+ years
Experience icon
4 - 15 lacs/annum

Looking for Big Data developers in Mumbai.

Job posted by
Sheela P
Apply for job

Aspirant - Data Science & AI

Founded 2012
Products and services{{j_company_types[1 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[1 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
0 - 10 years
Experience icon
3 - 9 lacs/annum

APPLY LINK: http://bit.ly/2yipqSE
Go through the entire job post thoroughly before pressing Apply. There is an eleven-character French word, v*n*i*r*t*e, mentioned somewhere in the whole text which is irrelevant to the context. You will be required to enter this word while applying, else the application won't be considered submitted.

Aspirant - Data Science & AI
Team: Sciences | Full-Time, Trainee | Bengaluru, India
Relevant Exp: 0 - 10 Years | Background: Top-tier institute | Compensation: Above standards

Busigence is a Decision Intelligence Company. We create decision intelligence products for real people by combining data, technology, business, and behavior, enabling strengthened decisions. We are a scaling, established startup by IIT alumni, innovating in and disrupting the marketing domain through artificial intelligence. We bring onboard people who are dedicated to delivering wisdom to humanity by solving the world's most pressing problems differently, thereby significantly impacting thousands of souls every day. We are a deep-rooted organization with a six-year success story, having worked with folks from top-tier backgrounds (IIT, NSIT, DCE, BITS, IIITs, NITs, IIMs, ISI, etc.) while maintaining an awesome culture with a common vision to build great data products. In the past we have served fifty-five customers and are presently developing our second product, Robonate. The first was emmoQ, an emotion intelligence platform. The third offering, H2HData, is an innovation lab where we solve hard problems through data, science, and design. We work extensively and intensely on big data, data science, machine learning, deep learning, reinforcement learning, data analytics, natural language processing, cognitive computing, and business intelligence.

First-and-Foremost: Before you dive in exploring this opportunity and press Apply, we wish you to evaluate yourself. We are looking for the right candidate, not the best candidate. We love to work with someone who can gel with our vision, beliefs, thoughts, methods, and values, which are aligned with what can be expected in a true startup with ambitious goals. Skills are always secondary to us. Primarily, you must be someone who is not essentially looking for a job or career, but rather starving for a challenge, probably without knowing since when. A book could be written on what an applicant must have before joining a real startup with a meaningful product. For brevity, in a nutshell, we need these three in you:
1. You must be [super sharp] (just an analogue, but Irodov, Mensa, Feynman, Polya, ACM, NIPS, ICAAC, BattleCode, DOTA, etc. should have been your done stuff. Can you relate solution 1 to problem 2, or do you get confused even when you have solved a similar problem in the past? Are you able to grasp a problem statement in one go, or do you get stuck?)
2. You must be [extremely energetic] (Do you raise eyebrows when asked to stretch your limits, both in terms of complexity and extra hours to put in? What comes first to your mind: let's finish it today, or this can be done tomorrow too? It's Friday 10 PM at work. Tired?)
3. You must be [honourably honest] (Do you tell others what you think, or what they want to hear? The latter is good for a sales team and its customers, not for this role. Are you honest with your work, and intrinsically with yourself first?)
You know yourself the best. If not, ask your loved ones and then decide. We clearly need exceedingly motivated people with entrepreneurial traits, not an employee mindset. This is an immediate requirement. We shall have an accelerated interview process for fast closure; you would be required to be proactive and responsive.

Real ROLE: We are looking for students, graduates, and experienced folks with a real passion for algorithms, computing, and analysis. You would be required to work with our sciences team on complex cases from data science, machine learning, and business analytics.

Mandatory:
R1. Must know functional programming (https://docs.python.org/2/howto/functional.html) in Python inside and out, with a strong flair for data structures, linear algebra, and algorithm implementation. OOP alone will not be accepted.
R2. Must have got your hands dirty with methods, functions, and workarounds in NumPy, Pandas, Scikit-learn, SciPy, and Statsmodels; collectively you should have implemented at least 100 different techniques (we averaged out this figure with past aspirants who have worked in this role).
R3. Must have implemented complex mathematical logic through a functional map-reduce framework in Python.
R4. Must have an understanding of the EDA cycle, machine learning algorithms, hyper-parameter optimization, ensemble learning, regularization, predictions, clustering, and associations, at an essential level.
R5. Must have solved at least five problems through data science & machine learning. Mere Coursera learning and/or offline Kaggle attempts will not be accepted.

Preferred:
R6. Good to have the calibre to learn PySpark within four weeks of joining us.
R7. Good to have the calibre to grasp the underlying business of a problem to be solved.
R8. Good to have an understanding of CNNs, RNNs, MLPs, and auto-encoders, at a basic level.
R9. Good to have solved at least three problems through deep learning. Mere Coursera learning and/or offline Kaggle attempts will not be accepted.
R10. Good to have worked on pre-processing techniques for images, audio, and text: OpenCV, Librosa, NLTK.
R11. Good to have used pre-trained models: VGGNet, Inception, ResNet, WaveNet, Word2Vec.

Ideal YOU:
Y1. Degree in engineering, or any other data-heavy field, at Bachelors level or above from a top-tier institute.
Y2. Relevant experience of 0 - 10 years working on real-world problems in a reputed company or a proven startup.
Y3. You are a fanatical implementer who loves to spend time with content, code, and workarounds, more than with your loved ones.
Y4. You are a true believer that human intelligence can be augmented through computer science & mathematics, and your survival vinaigrette depends on getting the most from the data.
Y5. You have an entrepreneurial mindset with ownership, intellectuality, and creativity as your way to work. These are not fancy words; we mean it.

Actual WE:
W1. A real startup with meaningful products.
W2. Revolutionary, not just disruptive.
W3. Rule creators, not followers.
W4. Small teams with real brains, not a herd of blockheads.
W5. We completely trust you and should be trusted back.

Why Us: In addition to the regular stuff which every good startup offers (lots of learning, food, parties, open culture, flexible working hours, and what not), we offer you the chance to do your greatest work of life. You shall be working on our revolutionary products, which are pioneers in their respective categories. This is a fact. We try really hard to hire fun-loving, crazy folks who are driven by more than a paycheck. You shall be working with the creamiest talent on extremely challenging problems at a most happening workplace.

How to Apply: You should apply online by clicking "Apply Now". For queries regarding an open position, please write to careers@busigence.com. For more information, visit http://www.busigence.com
Careers: http://careers.busigence.com
Research: http://research.busigence.com
Jobs: http://careers.busigence.com/jobs/data-science
If you feel you are the right fit for the position, you must attach a PDF resume highlighting your:
A. Key Skills
B. Knowledge Inputs
C. Major Accomplishments
D. Problems Solved
E. Submissions - GitHub / StackOverflow / Kaggle / Euler Project etc. (if applicable)
If you don't see an open position that interests you, join our Talent Pool and let us know how you can make a difference here. Referrals are more than welcome. Keep us in the loop.
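Since requirements R1 and R3 above emphasize functional programming and map-reduce style logic in Python, here is a minimal, generic sketch of that style (the specific computation is illustrative only and is not taken from the posting):

from functools import reduce

# Example dataset: (category, value) pairs; purely illustrative.
records = [("a", 3), ("b", 5), ("a", 7), ("b", 1), ("a", 2)]

# Map step: keep only category "a" and square each value, using filter/map.
mapped = map(lambda kv: kv[1] ** 2, filter(lambda kv: kv[0] == "a", records))

# Reduce step: fold the mapped values into a single sum without explicit loops.
total = reduce(lambda acc, x: acc + x, mapped, 0)

print(total)  # 3^2 + 7^2 + 2^2 = 62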

Job posted by
Seema Verma
Apply for job

Hadoop Engineers

Founded 2012
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[3 - 1]}} employees
{{j_company_stages[ - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
4 - 7 years
Experience icon
24 - 30 lacs/annum

Position Description:
Demonstrates up-to-date expertise in Software Engineering and applies this to the development, execution, and improvement of action plans. Models compliance with company policies and procedures and supports company mission, values, and standards of ethics and integrity. Provides and supports the implementation of business solutions. Provides support to the business. Troubleshoots business and production issues and provides on-call support.

Minimum Qualifications:
- BS/MS in Computer Science or a related field
- 5+ years' experience building web applications
- Solid understanding of computer science principles
- Excellent soft skills
- Understanding of the major algorithms, like searching and sorting
- Strong skills in writing clean code using languages like Java and J2EE technologies
- Understanding of how to engineer RESTful microservices, and knowledge of major software patterns like MVC, Singleton, Facade, and Business Delegate
- Deep knowledge of web technologies such as HTML5, CSS, and JSON
- Good understanding of continuous integration tools and frameworks like Jenkins
- Experience working in Agile environments, like Scrum and Kanban
- Experience in performance tuning for very large-scale apps
- Experience in scripting using Perl, Python, and shell scripting
- Experience writing jobs using open-source cluster computing frameworks like Spark (see the sketch below)
- Relational database design experience: MySQL, Oracle; SOLR; NoSQL: Cassandra, MongoDB, and Hive
- Aptitude for writing clean, succinct, and efficient code
- Attitude to thrive in a fun, fast-paced, start-up-like environment
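For the Spark item in the qualifications above, a minimal PySpark batch-job sketch (the file path and column names are hypothetical, not from the posting):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal batch job: aggregate event counts per user from a CSV file.
spark = SparkSession.builder.appName("event-counts").getOrCreate()

events = spark.read.csv("events.csv", header=True)  # hypothetical input file

counts = (events
          .groupBy("user_id")                 # hypothetical column
          .agg(F.count("*").alias("events"))
          .orderBy(F.desc("events")))

counts.show(10)
spark.stop()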

Job posted by
Sampreetha Pai
Apply for job

Hadoop Lead Engineers

Founded 2012
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[3 - 1]}} employees
{{j_company_stages[ - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
7 - 9 years
Experience icon
27 - 34 lacs/annum

Position Description:
Assists in providing guidance to small groups of two to three engineers, including offshore associates, for assigned engineering projects. Demonstrates up-to-date expertise in Software Engineering and applies this to the development, execution, and improvement of action plans. Generates weekly, monthly, and yearly reports using JIRA and open-source tools and provides updates to leadership teams. Proactively identifies issues and the root causes of critical issues. Works with cross-functional teams, sets up KT sessions, and mentors team members. Coordinates with the Sunnyvale and Bentonville teams. Models compliance with company policies and procedures and supports company mission, values, and standards of ethics and integrity. Provides and supports the implementation of business solutions. Provides support to the business. Troubleshoots business and production issues and provides on-call support.

Minimum Qualifications:
- BS/MS in Computer Science or a related field
- 8+ years' experience building web applications
- Solid understanding of computer science principles
- Excellent soft skills
- Understanding of the major algorithms, like searching and sorting
- Strong skills in writing clean code using languages like Java and J2EE technologies
- Understanding of how to engineer RESTful microservices, and knowledge of major software patterns like MVC, Singleton, Facade, and Business Delegate
- Deep knowledge of web technologies such as HTML5, CSS, and JSON
- Good understanding of continuous integration tools and frameworks like Jenkins
- Experience working in Agile environments, like Scrum and Kanban
- Experience in performance tuning for very large-scale apps
- Experience in scripting using Perl, Python, and shell scripting
- Experience writing jobs using open-source cluster computing frameworks like Spark
- Relational database design experience: MySQL, Oracle; SOLR; NoSQL: Cassandra, MongoDB, and Hive
- Aptitude for writing clean, succinct, and efficient code
- Attitude to thrive in a fun, fast-paced, start-up-like environment

Job posted by
Sampreetha Pai
Apply for job

Big Data

via OpexAI
Founded 2017
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[3 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Hyderabad
Experience icon
0 - 1 years
Experience icon
1 - 1 lacs/annum

Big Data, Business Intelligence, Python, and R skills.

Job posted by
Jasmine Shaik
Apply for job

Senior Member of Technical Staff

Founded 2005
Products and services{{j_company_types[1 - 1]}}
{{j_company_sizes[4 - 1]}} employees
{{j_company_stages[3 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
3 - 9 years
Experience icon
15 - 37 lacs/annum

SMTS - Data Analytics Platform

About [24]7 Innovation Labs: Data is changing human lives at the core. We collect so much data about everything, use it to learn many things, and apply the learnings in all aspects of our lives. [24]7 is at the forefront of applying data and machine learning to the world of customer acquisition and customer engagement. Our customer acquisition cloud uses the best of ML and AI to reach the right audiences, and our engagement cloud powers the interactions for the best experience. We service Fortune 100 enterprises globally and hundreds of millions of their customers every year, enabling 1.5B customer interactions annually.

We work on several challenging problems in the world of data processing and machine learning and use artificial intelligence to power Smart Agents. How do you process millions of events in a stream to derive intelligence? How do you learn from troves of data by applying scalable machine learning algorithms? How do you combine those learnings with real-time streams to make decisions in under 300 ms at scale? We work with the best of open-source technologies: Akka, Scala, Undertow, Spark, Spark ML, Hadoop, Cassandra, Mongo. Platform scale and real time are in our DNA, and we work hard every day to change the game in customer engagement. We believe in empowering smart people to work on larger-than-life problems with the best technologies and like-minded people. We are a pre-IPO, Silicon Valley based company with many global brands as our customers: Hilton, eBay, Time Warner Cable, Best Buy, Target, American Express, Capital One, and United Airlines. We touch more than 300 M visitors online every month with our technologies. We have one of the best work environments in Bangalore.

Eligibility: Bachelor's or Master's degree in Computer Science, Electrical Engineering, or equivalent.

Job Responsibilities:
• Design and development of various components of the big data platform
• Apply knowledge of Java/OO technologies and experience developing on commercial software platforms and/or large-scale data infrastructures
• Write efficient, quality code that scales to high-volume production
• Work closely with multiple product management and engineering teams to lead the design, build, and test of platform components
• Research and experiment with emerging technologies and tools related to big data

Required Traits:
• 4 to 6 years of software development experience using multiple programming languages
• Strong understanding of data structures and algorithms
• Experience in developing large-scale J2EE data processing systems/applications; experience with real-time systems is preferred
• Must have experience in core Java/Scala, Spark, Groovy & AngularJS
• Should be proficient in MongoDB, Cassandra, Couchbase, MSSQL/SQL Server
• Good to have knowledge of Druid, Postgres & Zeppelin
• Experience with one or more big data architectures, including OpenStack, Hadoop, Pig, Hive, or other big data frameworks
• Ability to participate in large-scale initiatives and work towards common goals
• Excellent oral and written communication, presentation, and analytical skills

Job posted by
Achappa Bheemaiah
Apply for job

Engineering Manager
at Uber

via Uber
Founded 2012
Products and services{{j_company_types[1 - 1]}}
{{j_company_sizes[4 - 1]}} employees
{{j_company_stages[2 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
9 - 15 years
Experience icon
50 - 80 lacs/annum

Minimum 5+ years of experience as a manager and overall 10+ years of industry experience in a variety of contexts, during which you've built scalable, robust, and fault-tolerant systems. You have a solid knowledge of the whole web stack: front-end, back-end, databases, cache layer, HTTP protocol, TCP/IP, Linux, CPU architecture, etc. You are comfortable jamming on complex architecture and design principles with senior engineers.

Bias for action. You believe that speed and quality aren't mutually exclusive. You've shown good judgement about shipping as fast as possible while still making sure that products are built in a sustainable, responsible way.

Mentorship/Guidance. You know that the most important part of your job is setting the team up for success. Through mentoring, teaching, and reviewing, you help other engineers make sound architectural decisions, improve their code quality, and get out of their comfort zone.

Commitment. You care tremendously about keeping the Uber experience consistent for users and strive to make any issues invisible to riders. You hold yourself personally accountable, jumping in and taking ownership of problems that might not even be in your team's scope.

Hiring know-how. You're a thoughtful interviewer who constantly raises the bar for excellence. You believe that what seems amazing one day becomes the norm the next day, and that each new hire should significantly improve the team.

Design and business vision. You help your team understand requirements beyond the written word, and you thrive in an environment where you can uncover subtle details. Even in the absence of a PM or a designer, you show great attention to the design and product aspects of anything your team ships.

Job posted by
Swati Singh
Apply for job

Data Science Engineer (SDE I)

Founded 2017
Products and services{{j_company_types[1 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[3 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
1 - 3 years
Experience icon
12 - 20 lacs/annum

Couture.ai is building a patent-pending AI platform targeted towards vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined >200 million end users. For this role, a credible display of innovation in past projects (or academia) is a must. We are looking for a candidate who lives and talks data & algorithms, loves to play with big data engineering, and is hands-on with Apache Spark, Kafka, RDBMS/NoSQL DBs, big data analytics, and handling Unix & production servers. A tier-1 college background (BE from IITs, BITS Pilani, top NITs, IIITs, or an MS from Stanford, Berkeley, CMU, UW-Madison) or an exceptionally bright work history is a must. Let us know if this interests you enough to explore the profile further.

Job posted by
Shobhit Agarwal
Apply for job

Senior Data Engineer (SDE II)

Founded 2017
Products and services{{j_company_types[1 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[3 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
2 - 7 years
Experience icon
15 - 30 lacs/annum

Couture.ai is building a patent-pending AI platform targeted towards vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined >200 million end users. The founding team consists of BITS Pilani alumni with experience creating global startup success stories. The core team we are building consists of some of the best minds in India in artificial intelligence research and data engineering. We are looking to fill multiple roles requiring 2-7 years of research or large-scale production implementation experience with:
- Rock-solid algorithmic capabilities
- Production deployments for massively large-scale systems, real-time personalization, big data analytics, and semantic search
- Or credible research experience in innovating new ML algorithms and neural nets
A GitHub profile link is highly valued. For the right fit into the Couture.ai family, compensation is no bar.

Job posted by
Shobhit Agarwal
Apply for job

Senior Backend Developer

Founded 2016
Products and services{{j_company_types[1 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[2 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
1 - 7 years
Experience icon
15 - 40 lacs/annum

RESPONSIBILITIES:
1. Full ownership of tech, right from driving product decisions to architecture to deployment.
2. Develop cutting-edge user experiences and build cutting-edge technology solutions like instant messaging on poor networks, live discussions, live videos, and optimal matching.
3. Use billions of data points to build a user personalisation engine.
4. Build a data network effects engine to increase engagement & virality.
5. Scale the systems to billions of daily hits.
6. Deep dive into performance, power management, memory optimisation & network connectivity optimisation for the next billion Indians.
7. Orchestrate complicated workflows, asynchronous actions, and higher-order components.
8. Work directly with Product and Design teams.

REQUIREMENTS:
1. Should have hacked some (computer or non-computer) system to your advantage.
2. Built and managed systems with a scale of 10Mn+ daily hits.
3. Strong architectural experience.
4. Strong experience in memory management, performance tuning, and resource optimisation.
5. PREFERENCE: if you are a woman, an ex-entrepreneur, or hold a CS bachelor's degree from IIT/BITS/NIT.
P.S. If you don't fulfil one of the requirements, you need to be exceptional in the others to be considered.

Job posted by
Shubham Maheshwari
Apply for job

Python Developer

Founded 2015
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[4 - 1]}} employees
{{j_company_stages[3 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
4 - 8 years
Experience icon
12 - 25 lacs/annum

• Good experience in Python and SQL
• Experience in Hive / Presto is a plus
• Strong skills in using Python / R for building data pipelines and analysis
• Good programming background:
  o Writing efficient and re-usable code
  o Comfort with working on the CLI and with tools like GitHub
Other, softer aspects that are important:
• Fast learner: no matter how much programming a person has done in the past, willingness to learn new tools is the key
• An eye for standardization and scalability of processes: the person will not need to do this alone, but it helps for everyone on the team to have this orientation
• A generalist mindset: everyone on the team will also need to work on front-end tools (Tableau and Unidash), so openness to playing a little outside the comfort zone
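As a small sketch of the "data pipeline" work mentioned above (the file name and columns are hypothetical, not from the posting), a reusable Python pipeline step might look like:

import pandas as pd  # assumes pandas is installed

def daily_revenue(csv_path):
    """Toy pipeline step: load raw orders and aggregate revenue per day."""
    orders = pd.read_csv(csv_path, parse_dates=["order_date"])  # hypothetical columns
    orders["revenue"] = orders["quantity"] * orders["unit_price"]
    daily = (orders
             .groupby(orders["order_date"].dt.date)["revenue"]
             .sum()
             .reset_index(name="daily_revenue"))
    return daily

if __name__ == "__main__":
    print(daily_revenue("orders.csv").head())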

Job posted by
Jiten Chanana
Apply for job

Python, SQL, Hive, Presto - Analytics Background

Founded 2015
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[4 - 1]}} employees
{{j_company_stages[3 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
4 - 8 years
Experience icon
12 - 20 lacs/annum

Good programming background: writing efficient and re-usable code, and comfort with working on the CLI & GitHub.

Job posted by
Shankar Raman
Apply for job

Technical Architect/CTO

Founded 2016
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[1 - 1]}} employees
{{j_company_stages[1 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Mumbai
Experience icon
5 - 11 years
Experience icon
15 - 30 lacs/annum

ABOUT US: Arque Capital is a FinTech startup working with AI in finance, in domains like asset management (hedge funds, ETFs, and structured products), robo-advisory, bespoke research, alternate brokerage, and other applications of technology & quantitative methods in big finance.

PROFILE DESCRIPTION:
1. Get the "tech" in order for the hedge fund: help answer the fundamentals of which technology blocks to use and the choice of one platform/tech over another, and help the team visualize the product with the available resources and assets.
2. Build, manage, and validate a tech roadmap for our products.
3. Architecture practices: at startups, the dynamics change very fast. Making sure that best practices are defined and followed by the team is very important. The CTO may have to be the garbage collector and clean up the code from time to time; reviewing code quality is an important activity the CTO should own.
4. Build a progressive learning culture and establish a predictable model of envisioning, designing, and developing products.
5. Product innovation through research and continuous improvement.
6. Build out the technological infrastructure for the hedge fund.
7. Hire and build out the technology team.
8. Set up and manage the entire IT infrastructure: hardware as well as cloud.
9. Ensure company-wide security and IP protection.

REQUIREMENTS:
- Computer Science Engineer from Tier-I colleges only (IIT, IIIT, NIT, BITS, DHU, Anna University, MU)
- 5-10 years of relevant technology experience (no infra or database persons)
- Expertise in Python and C++ (3+ years minimum)
- 2+ years' experience building and managing Big Data projects
- Experience with technical design & architecture (1+ years minimum)
- Experience with high-performance computing - OPTIONAL
- Experience as a Tech Lead, IT Manager, Director, VP, or CTO
- 1+ year of experience managing cloud computing infrastructure (Amazon AWS preferred) - OPTIONAL
- Ability to work in an unstructured environment
- Looking to work in a small, start-up type environment based out of Mumbai

COMPENSATION: Co-founder status and equity partnership.

Job posted by
Hrishabh Sanghvi
Apply for job

Technical Architect

Founded 2006
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[4 - 1]}} employees
{{j_company_stages[3 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Pune
Experience icon
9 - 20+ years
Experience icon
13 - 25 lacs/annum

The hunt is for an AWS Big Data / DWH Architect with the ability to manage effective relationships with a wide range of stakeholders (customers and team members alike). The incumbent will demonstrate personal commitment and accountability to ensure standards are continuously sustained and improved, both within the internal teams and with partner organizations and suppliers.

We at Nitor Infotech, a Product Engineering Services company, are always on the hunt for the best talent in the IT industry, in keeping with our trend of "What's next in IT". We are scouting for result-oriented resources with a passion for product and technology services and for creating great customer experiences; someone who can take the current expertise and footprint of Nitor Infotech Inc. to an altogether different dimension and level, in tune with emerging market trends, and ensure that "Brilliance @ Work" continues to prevail in whatever we do. Nitor Infotech works with global ISVs to help them build and accelerate their product development. Nitor is able to do so because product development is in its DNA, enriched by 10 years of expertise, best practices, frameworks, and accelerators. Because of this ability, Nitor Infotech has been able to build business relationships with product companies having revenues from $50 million to $1 billion.

Requirements:
• 7-12+ years of relevant experience working in the Database, BI, and Analytics space, with 0-2 years of architecting and designing data warehouses, including 2-3 years in the Big Data ecosystem
• Experience in data warehouse design in AWS
• Strong architecting, programming, and design skills, and a proven track record of architecting and building large-scale, distributed big data solutions
• Provide professional and technical advice on Big Data concepts and technologies, in particular highlighting the business potential of real-time analysis
• Provide technical leadership in the Big Data space (Hadoop stack: M/R, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.; NoSQL stores: MongoDB, Cassandra, HBase, etc.)
• Performance tuning of Hadoop clusters and Hadoop MapReduce routines
• Evaluate and recommend the Big Data technology stack for the platform
• Drive significant technology initiatives end to end and across multiple layers of architecture
• Breadth of BI knowledge, including MSBI, database design, and newer visualization tools like Tableau, QlikView, and Power BI
• Understanding of the internals and intricacies of old and new DB platforms, including: strong RDBMS fundamentals in any of SQL Server / MySQL / Oracle; DB and DWH design; designing semantic models using OLAP and tabular models with MS and non-MS tools; NoSQL DBs including document, graph, search, and columnar DBs
• Excellent communication skills and a strong ability to build good rapport with prospects and existing customers
• Be a mentor and go-to person for junior members of the team

Qualification & Experience: Educational qualification of BE/ME/B.Tech/M.Tech, BCA/MCA/BCS/MCS, or any other degree with a relevant IT qualification.

Job posted by
Balakumar Mohan
Apply for job

Big Data Evangelist

Founded 2016
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[3 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Noida, Hyderabad, NCR (Delhi | Gurgaon | Noida)
Experience icon
2 - 6 years
Experience icon
4 - 12 lacs/annum

Looking for a technically sound and excellent trainer in big data technologies. Get an opportunity to become well known in the industry and gain visibility. Host regular sessions on Big Data related technologies and get paid to learn.

Job posted by
Suchit Majumdar
Apply for job

Database Architect

Founded 2017
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[2 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
5 - 10 years
Experience icon
10 - 20 lacs/annum

The candidate will be responsible for all aspects of data acquisition, data transformation, analytics scheduling, and operationalization to drive high-visibility, cross-division outcomes. Expected deliverables include the development of Big Data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's Data Lake.

Key Responsibilities:
- Create a GRAND Data Lake and Warehouse which pools the data from different regions and stores of GRAND in GCC
- Ensure source data quality measurement, enrichment, and reporting of data quality
- Manage all ETL and data model update routines
- Integrate new data sources into the DWH
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure

Skills Needed:
- Very strong in SQL; demonstrated experience with RDBMS and Unix shell scripting preferred (e.g., SQL, Postgres, MongoDB, etc.)
- Experience with UNIX and comfort working with the shell (bash or cron preferred)
- Good understanding of data warehousing concepts and big data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce
- Align with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments
- Work with data delivery teams to set up new Hadoop users, including setting up Linux users and setting up and testing HDFS, Hive, Pig, and MapReduce access for the new users
- Cluster maintenance, as well as creation and removal of nodes, using tools like Ganglia, Nagios, Cloudera Manager Enterprise, and other tools
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screen Hadoop cluster job performance and do capacity planning
- Monitor Hadoop cluster connectivity and security
- File system management and monitoring
- HDFS support and maintenance
- Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
- Define, develop, document, and maintain Hive-based ETL mappings and scripts

Job posted by
Rahul Malani
Apply for job

Senior Technologist @ Intelligent Travel Search startup

Founded 2016
Products and services{{j_company_types[1 - 1]}}
{{j_company_sizes[1 - 1]}} employees
{{j_company_stages[2 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
5 - 15 years
Experience icon
6 - 18 lacs/annum

Key Skills Expected:
• Will be expected to architect, develop, and maintain large-scale distributed systems
• Should have excellent coding skills and a good understanding of MVC frameworks
• Strong understanding of and experience in building efficient search & recommendation algorithms; experience in Machine/Deep Learning would be beneficial
• Experience in Python-Django would be a plus
• Strong knowledge of hosting web services like AWS, Google Cloud Platform, etc. is critical
• Sound understanding of front-end web technologies such as HTML, CSS, JavaScript, jQuery, AngularJS, etc.
We are looking for self-starters who are looking to solve hard problems.
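On the search & recommendation point above, a tiny illustrative sketch (the rating matrix is made up, not from the posting) of item-to-item similarity using cosine similarity:

import numpy as np

# Toy user x item rating matrix (rows = users, columns = items); illustrative only.
ratings = np.array([
    [5.0, 3.0, 0.0, 1.0],
    [4.0, 0.0, 0.0, 1.0],
    [1.0, 1.0, 0.0, 5.0],
    [0.0, 0.0, 5.0, 4.0],
])

def most_similar_item(item_idx):
    """Return the index of the item whose rating column is most similar (cosine)."""
    target = ratings[:, item_idx]
    norms = np.linalg.norm(ratings, axis=0) * np.linalg.norm(target)
    sims = ratings.T @ target / np.where(norms == 0, 1, norms)
    sims[item_idx] = -1.0  # exclude the item itself
    return int(np.argmax(sims))

print(most_similar_item(0))  # recommend the item most similar to item 0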

Job posted by
Varun Gupta
Apply for job

Cassandra Engineer/Developer/Architect

Founded 2017
Products and services{{j_company_types[3 - 1]}}
{{j_company_sizes[2 - 1]}} employees
{{j_company_stages[1 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Bengaluru (Bangalore)
Experience icon
1 - 3 years
Experience icon
6 - 20 lacs/annum

www.aaknet.co.in/careers/careers-at-aaknet.html You are extraordinary, a rock star who has hardly found a place to leverage or challenge your potential, and hasn't spotted a skyrocketing opportunity yet? Come play with us: face the challenges we can throw at you; chances are you might be humiliated (positively); do not take it that seriously though! Please be informed that we rate CHARACTER and attitude highly, if not more than your great skills, experience, sharpness, etc. :) Best wishes & regards, Team Aak!

Job posted by
Debdas Sinha
Apply for job

Python Developer

Founded
Products and services{{j_company_types[ - 1]}}
{{j_company_sizes[ - 1]}} employees
{{j_company_stages[ - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Chennai
Experience icon
2 - 7 years
Experience icon
6 - 18 lacs/annum

Full Stack Developer for Big Data Practice. Will include everything from architecture to ETL to model building to visualization.

Job posted by
Bavani T
Apply for job

Python Developer

Founded 2015
Products and services{{j_company_types[1 - 1]}}
{{j_company_sizes[3 - 1]}} employees
{{j_company_stages[3 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Pune
Experience icon
2 - 5 years
Experience icon
5 - 10 lacs/annum

We are an early-stage startup working in the space of analytics, big data, machine learning, data visualization on multiple platforms, and SaaS. We have offices in Palo Alto and at WTC, Kharadi, Pune, and count some marquee names among our customers. We are looking for a really good Python programmer who MUST have scientific programming experience (Python, etc.). Hands-on experience with NumPy and the Python scientific stack is a must. Demonstrated ability to track and work with hundreds to thousands of files and GB-TB of data. Exposure to ML and data mining algorithms. Needs to be comfortable working in a Unix environment and with SQL.

You will be required to do the following:
- Use command-line tools to perform data conversion and analysis
- Support other team members in retrieving and archiving experimental results
- Quickly write scripts to automate routine analysis tasks
- Create insightful, simple graphics to represent complex trends
- Explore/design/invent new tools and design patterns to solve complex big data problems
Experience working on a long-term, lab-based project is desirable (academic experience acceptable).
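To illustrate the "quick analysis script" style of work described above (the file layout is hypothetical, not from the posting), a short NumPy sketch:

import sys
import numpy as np

def summarize_results(path):
    """Load a whitespace-delimited numeric results file and print basic statistics."""
    data = np.loadtxt(path)          # hypothetical experiment output, one row per trial
    print("rows:", data.shape[0])
    print("mean per column:", data.mean(axis=0))
    print("std per column:", data.std(axis=0))
    print("overall max:", data.max())

if __name__ == "__main__":
    # Usage: python summarize_results.py results.txt
    summarize_results(sys.argv[1])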

Job posted by
Nischal Vohra
Apply for job

Power BI (Business Intelligence) Developer

Founded 2014
Products and services{{j_company_types[2 - 1]}}
{{j_company_sizes[1 - 1]}} employees
{{j_company_stages[1 - 1]}}
{{rendered_skills_map[skill] || skill}}
Location icon
Pune, Kharadi
Experience icon
2 - 5 years
Experience icon
4 - 7 lacs/annum

Should be able to create awesome dashboards.
Should have hands-on knowledge of all of the following:
> Visualizations
> Datasets
> Reports
> Dashboards
> Tiles
Excellent querying skills using T-SQL.
Should have prior exposure to SSRS and/or SSAS.
Working knowledge of Microsoft Power Pivot, Power View, and Power BI Desktop.

Job posted by
Yogita Purandare
Apply for job
Why apply via CutShort?
Connect with actual hiring teams and get their fast response. No 3rd party recruiters. No spam.