Position: R&D - Senior Engineer
Reports To: Chief Architect
Experience: 4+ Years
Education: BE/ME/MS

Job Summary:
- We are seeking a highly skilled, experienced Java developer to join our R&D team. In this role, you will build proofs of concept (POCs) using new and bleeding-edge technologies, compare them with similar alternatives, and draw out their merits and demerits.
- Demonstrate an MVP with small use cases; once it is reviewed and approved, design and develop a first-cut solution that is scalable, relevant, and critical to our company's success, then hand it over to the engineering team to take forward, guiding them in making it a full-fledged product/service/solution.
- You will focus on Java/Java EE/Python development throughout and must have a solid skill set, problem-solving ability, analytical thinking, a desire to continue growing as a developer, and a team-player mentality. POC work involves experimenting with bleeding-edge technologies across languages such as Java and Python.
Duties and Responsibilities:
- Provide solutions (new technologies, tools, or services) for the product's current technology bottlenecks
- Work on proofs of concept for product/business requirements, employing the latest technologies to understand their fit within the product's technology stack and evaluate their merits and demerits
- Gather requirements from internal and external stakeholders
- Participate in the design and implementation of essential applications
- Demonstrate expertise and add valuable input throughout the POC/development lifecycle
- Help design and implement scalable, lasting technology solutions
- Review current systems, suggesting updates as needed
- Test and debug new applications and updates
- Resolve reported issues and reply to queries in a timely manner
- Develop and utilize technical change documentation
- Strive to deploy all products and updates on time
- Help improve code quality by implementing recommended best practices
- Remain up to date on current best practices, trends, and industry developments
- Maintain a high standard of work quality and encourage others to do the same
- Help junior team members grow and develop their skills
- Identify potential challenges and bottlenecks and address them proactively

Requirements and Qualifications:
- BS/MS/MTech in computer science or a related field required
- Minimum 4 years of experience at a reputed software firm
- Strong knowledge of computer science fundamentals such as algorithms and data structures
- Strong problem-solving and analytical thinking capability
- Strong working knowledge of Java and J2EE technologies
- Significant experience working with SQL
- Significant experience working with NoSQL stores such as MongoDB, DynamoDB, MemSQL, or graph databases
- Significant experience working with ElastiCache
- Significant experience working with distributed architectures
- Knowledge of or working experience in Python
- Significant experience working with web services and REST frameworks
- Experience with AWS (S3, Lambda, Kinesis, SQS) highly desired
- Experience with frameworks like Spring, Hadoop, Spark, Kafka a plus
- Experience with machine learning and NLP a plus
- Familiarity with Elasticsearch
- Familiarity with Java web application servers such as Tomcat, WebLogic, JBoss
- Familiarity with microservices and/or Spring Boot
- Familiarity with HTML, CSS, JavaScript
- Hobby projects are a plus

Manthan Profile:
Manthan is the Chief Analytics Officer for consumer industries worldwide. Manthan's portfolio of analytics-enabled business applications, advanced analytics platforms, and solutions is architected to help users across industries walk the complete data-to-result path: analyze, take guided decisions, and execute those decisions in real time. Sophisticated yet intuitive analytical capability, coupled with the power of big data, mobility, and cloud computing, brings users business-ready applications that provide on-demand access and real-time execution - the only path to profit in a contemporary, on-demand, connected economy. Manthan serves over 200 leading organizations across 23 countries. With the recent introduction of Maya, the world's first AI-powered conversational agent for business analytics, Manthan is pioneering the move to zero-touch UIs and transforming user interactions with complex analytics applications. Manthan is one of the most awarded analytics innovators among analysts and customers alike. To learn how businesses can gain from analytics, please visit https://www.manthan.com
Interested in building high-performance search systems that handle petabytes of retail data while working in an agile, small-company environment? At CodeHall Technologies, you will have the opportunity to work with the newest technology in Search and Browse. We are working on systems that power and personalize site search, considering the user's intent for every query and providing a wholly unique, engaging search experience designed to display the most relevant results through findability.

Primary responsibilities:
- Building high-performance search systems for personalization, optimization, and targeting
- Building systems with Hadoop, Solr, Cassandra, Flink, Spark, and MongoDB
- Deep understanding of HTTP and REST principles
- Good diagnostic and troubleshooting skills
- Unit testing with JUnit; performance testing and tuning
- Working with rapid, innovative development methodologies such as Kanban, continuous integration, and daily deployments
- Highly proficient software engineering skills in Java
- Coordination with internal and external teams
- Mentoring junior engineers
- Participating in product design discussions and decisions

Minimum requirements:
- BS/MS in CS, Electrical Engineering, or foreign equivalent, plus relevant software development experience
- At least 5-8 years of software development experience
- Expert in Java, Scala, or another object-oriented language
- Proficient in SQL concepts (HiveQL or Postgres a plus)
- Additional language skills for scripting and rapid application development

Desired skills and experience:
- Working with large data sets in the petabyte range
- Familiarity with UNIX (systems skills a plus)
- Working experience with Solr, Cassandra, MongoDB, and Hadoop
- Experience working in a distributed environment and dealing with challenges around scaling and performance
- Proven ability to project and meet scheduled deadlines
- Self-driven, quick learner with attention to detail and quality
Role Brief: 6+ years of demonstrable experience designing technological solutions to complex data problems, and developing and testing modular, reusable, efficient, and scalable code to implement those solutions.

Brief about Fractal & Team: Fractal Analytics leads Fortune 500 companies in leveraging big data, analytics, and technology to drive smarter, faster, and more accurate decisions in every aspect of their business. Our Big Data capability team is hiring technologists who can produce beautiful, functional code to solve complex analytics problems. If you are an exceptional developer who loves to push the boundaries to solve complex business problems with innovative solutions, we would like to talk with you.

Job Responsibilities:
- Provide technical leadership in the Big Data space across Fractal (Hadoop stack: MapReduce, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.; NoSQL stores such as Cassandra and HBase), and contribute to open-source Big Data technologies
- Visualize and evangelize next-generation infrastructure in the Big Data space (batch, near-real-time, and real-time technologies)
- Evaluate and recommend a Big Data technology stack that aligns with the company's technology
- Be passionate about continuous learning: experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms
- Drive significant technology initiatives end to end and across multiple layers of architecture
- Provide strong technical expertise (performance, application design, stack upgrades) to lead Platform Engineering
- Define and drive best practices for the Big Data stack, and evangelize those practices across teams and BUs
- Drive operational excellence through root-cause analysis and continuous improvement of Big Data technologies and processes, contributing back to the open-source community
- Provide technical leadership and be a role model to data engineers pursuing a technical career path in engineering
- Provide and inspire innovations that fuel the growth of Fractal as a whole

EXPERIENCE:
Must Have (ideally, this would include work on the following technologies):
- Expert-level proficiency in at least one of Java, C++, or Python (preferred); Scala knowledge is a strong advantage
- Strong understanding of and experience with distributed computing frameworks, particularly Apache Hadoop 2.0 (YARN, MapReduce, HDFS) and associated technologies - one or more of Hive, Sqoop, Avro, Flume, Oozie, ZooKeeper, etc.
- Hands-on experience with Apache Spark and its components (Streaming, SQL, MLlib) is a strong advantage
- Operating knowledge of cloud computing platforms (AWS, especially the EMR, EC2, S3, and SWF services and the AWS CLI)
- Experience working within a Linux computing environment and using command-line tools, including shell/Python scripting for automating common tasks
- Ability to work in a team in an agile setting, familiarity with JIRA, and a clear understanding of how Git works
- A technologist who loves to code and design

In addition, the ideal candidate would have great problem-solving skills and the ability and confidence to hack their way out of tight corners.

Relevant Experience:
- Java, Python, or C++ expertise
- Linux environment and shell scripting
- Distributed computing frameworks (Hadoop or Spark)
- Cloud computing platforms (AWS)

Good to have:
- A statistical or machine learning DSL such as R
- Distributed and low-latency (streaming) application architecture
- Row-store distributed DBMSs such as Cassandra
- Familiarity with API design

Qualification: B.E/B.Tech/M.Tech in Computer Science or a related technical degree, or equivalent
ITTStar Global Services is a subsidiary unit in Bengaluru with its head office in Atlanta, Georgia. We are primarily into data management and data life cycle solutions, including machine learning and artificial intelligence. For further info, visit ITTstar.com. As discussed over the call, I am forwarding the job description. We are looking for enthusiastic and experienced data engineers to be part of our bustling team of professionals at our Bengaluru location.

JOB DESCRIPTION:
1. Experience in Spark & Big Data is mandatory.
2. Strong programming skills in Python, Java, Scala, or Node.js.
3. Hands-on experience handling multiple data types: JSON, XML, delimited, and unstructured.
4. Hands-on experience working with at least one relational and/or NoSQL database.
5. Knowledge of SQL queries and data modeling.
6. Hands-on experience with ETL use cases, either on-premise or in the cloud.
7. Experience with any cloud platform (AWS, Azure, GCP, Alibaba).
8. Knowledge of one or more AWS services such as Kinesis, EC2, EMR, Hive integration, Athena, Firehose, Lambda, S3, Glue Crawler, Redshift, or RDS is a plus.
9. Good communication skills and self-driven - should be able to deliver projects with minimal instruction from the client.
Job Skill Requirements:
• 4+ years of experience building and managing complex products/solutions
• 2+ years of experience in DW/ELT/ETL technologies - nice to have
• 3+ years of hands-on development experience using big data technologies like Hadoop and Spark
• 3+ years of hands-on development experience using big data ecosystem components like Hive, Impala, HBase, Sqoop, Oozie, etc.
• Proficient-level programming in Scala
• Good to have: hands-on experience building web services in a Python/Scala stack
• Good to have: experience developing RESTful web services
• Knowledge of web technologies and protocols (NoSQL/JSON/REST/JMS)
RESPONSIBILITIES:
1. Full ownership of tech, from driving product decisions to architecture to deployment.
2. Develop cutting-edge user experiences and build cutting-edge technology solutions such as instant messaging on poor networks, live discussions, live videos, and optimal matching.
3. Use billions of data points to build a user personalization engine.
4. Build a data network effects engine to increase engagement and virality.
5. Scale the systems to billions of daily hits.
6. Deep-dive into performance, power management, memory optimization, and network connectivity optimization for the next billion Indians.
7. Orchestrate complicated workflows, asynchronous actions, and higher-order components.
8. Work directly with Product and Design teams.

REQUIREMENTS:
1. Should have hacked some (computer or non-computer) system to your advantage.
2. Built and managed systems with a scale of 10Mn+ daily hits.
3. Strong architectural experience.
4. Strong experience in memory management, performance tuning, and resource optimization.
5. PREFERENCE: if you are a woman, an ex-entrepreneur, or hold a CS bachelor's degree from IIT/BITS/NIT.
P.S. If you don't fulfill one of the requirements, you need to be exceptional in the others to be considered.
Position Description:
- Assists in providing guidance to small groups of two to three engineers, including offshore associates, for assigned engineering projects
- Demonstrates up-to-date expertise in software engineering and applies it to the development, execution, and improvement of action plans
- Generates weekly, monthly, and yearly reports using JIRA and open-source tools, and provides updates to leadership teams
- Proactively identifies issues and the root causes of critical issues
- Works with cross-functional teams, sets up KT sessions, and mentors team members
- Coordinates with the Sunnyvale and Bentonville teams
- Models compliance with company policies and procedures and supports the company mission, values, and standards of ethics and integrity
- Provides and supports the implementation of business solutions
- Provides support to the business; troubleshoots business and production issues and provides on-call support

Minimum Qualifications:
- BS/MS in Computer Science or a related field
- 8+ years' experience building web applications
- Solid understanding of computer science principles, including major algorithms such as searching and sorting
- Excellent soft skills
- Strong skills in writing clean code using languages like Java and J2EE technologies
- Understanding of how to engineer RESTful and microservices architectures, and knowledge of major software patterns such as MVC, Singleton, Facade, and Business Delegate
- Deep knowledge of web technologies such as HTML5, CSS, and JSON
- Good understanding of continuous integration tools and frameworks such as Jenkins
- Experience working in agile environments, such as Scrum and Kanban
- Experience with performance tuning for very large-scale apps
- Experience writing scripts using Perl, Python, and shell scripting
- Experience writing jobs using open-source cluster computing frameworks such as Spark
- Relational database design experience (MySQL, Oracle), plus SOLR and NoSQL stores such as Cassandra, MongoDB, and Hive
- Aptitude for writing clean, succinct, and efficient code
- Attitude to thrive in a fun, fast-paced, start-up-like environment
Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined 200+ million end users. For this role, a credible display of innovation in past projects (or academia) is a must. We are looking for a candidate who lives and talks data and algorithms, loves to play with big data engineering, and is hands-on with Apache Spark, Kafka, RDBMS/NoSQL databases, big data analytics, and Unix and production servers. A Tier-1 college (BE from IITs, BITS Pilani, top NITs, or IIITs, or an MS from Stanford, Berkeley, CMU, or UW-Madison) or an exceptionally bright work history is a must. Let us know if you are interested in exploring the profile further.
Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined 200+ million end users. The founding team consists of BITS Pilani alumni with experience creating global startup success stories. The core team we are building consists of some of the best minds in India in artificial intelligence research and data engineering. We are looking to fill multiple roles requiring 2-7 years of research or large-scale production implementation experience with:
- Rock-solid algorithmic capabilities.
- Production deployments for massively large-scale systems, real-time personalization, big data analytics, and semantic search.
- Or credible research experience in innovating new ML algorithms and neural nets.
A GitHub profile link is highly valued. For the right fit into the Couture.ai family, compensation is no bar.
Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined 200+ million end users. For this role, a credible display of innovation in past projects is a must. We are looking for hands-on leaders in data engineering with 5-11 years of research or large-scale production implementation experience with:
- Proven expertise in Spark, Kafka, and the Hadoop ecosystem.
- Rock-solid algorithmic capabilities.
- Production deployments for massively large-scale systems, real-time personalization, big data analytics, and semantic search.
- Expertise in containerization (Docker, Kubernetes) and cloud infrastructure, preferably OpenStack.
- Experience with Spark ML, TensorFlow (and TF Serving), MXNet, Scala, Python, NoSQL databases, Kubernetes, and Elasticsearch/Solr in production.
A Tier-1 college (BE from IITs, BITS Pilani, IIITs, top NITs, DTU, or NSIT, or an MS from Stanford, UC, MIT, CMU, UW-Madison, ETH, or other top global schools) or an exceptionally bright work history is a must. Let us know if you are interested in exploring the profile further.
The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization to drive high-visibility, cross-division outcomes. Expected deliverables include developing big data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's Data Lake.

Key Responsibilities:
- Create a GRAND Data Lake and Warehouse that pools the data from GRAND's different regions and stores in GCC
- Ensure source data quality measurement, enrichment, and reporting of data quality
- Manage all ETL and data model update routines
- Integrate new data sources into the DWH
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure

Skills Needed:
- Very strong SQL; demonstrated experience with RDBMSs (e.g., Postgres) and stores such as MongoDB; Unix shell scripting preferred
- Experience with UNIX and comfort working with the shell (bash or Korn shell preferred)
- Good understanding of data warehousing concepts and big data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce
- Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop, and to expand existing environments
- Working with data delivery teams to set up new Hadoop users, including setting up Linux users and setting up and testing HDFS, Hive, Pig, and MapReduce access for the new users
- Cluster maintenance, including creation and removal of nodes, using tools like Ganglia, Nagios, Cloudera Manager Enterprise, and others
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screening Hadoop cluster job performance and capacity planning
- Monitoring Hadoop cluster connectivity and security
- File system management and monitoring; HDFS support and maintenance
- Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
- Defining, developing, documenting, and maintaining Hive-based ETL mappings and scripts
zeotap helps telecom operators safely unlock the potential of their data across industries using privacy-by-design technology. http://www.zeotap.com
Check our JD: https://www.zeotap.com/job/senior-tech-lead-m-f-for-zeotap/oEQK2fw0