About 7 Innovation Labs

Data is changing human lives at the core: we collect data about everything, learn from it, and apply the learnings in all aspects of our lives. 7 is at the forefront of applying data and machine learning to the world of customer acquisition and customer engagement. Our customer acquisition cloud uses the best of ML and AI to reach the right audiences, and our engagement cloud powers interactions for the best experience. We serve Fortune 100 enterprises globally and hundreds of millions of their customers every year, enabling 1.5B customer interactions annually.

We work on several challenging problems in data processing and machine learning, and use artificial intelligence to power Smart Agents. How do you process millions of events in a stream to derive intelligence? How do you learn from troves of data by applying scalable machine learning algorithms? How do you combine the learnings with real-time streams to make decisions in under 300 msec at scale? We work with the best open source technologies: Akka, Scala, Undertow, Spark, Spark ML, Hadoop, Cassandra, MongoDB. Platform scale and real time are in our DNA, and we work hard every day to change the game in customer engagement. We believe in empowering smart people to work on larger-than-life problems with the best technologies and like-minded people.

We are a pre-IPO Silicon Valley based company with many global brands as our customers, including Hilton, eBay, Time Warner Cable, Best Buy, Target, American Express, Capital One and United Airlines. We touch more than 300M visitors online every month with our technologies. We have one of the best work environments in Bangalore.

Opportunity

Principal Member of Technical Staff is one of our distinguished individual contributor roles, for engineers who can take on problems of size and scale.
You will be responsible for working with a team of smart and highly capable engineers to design solutions, and for working closely on implementation, testing, deployment and 24x7 runtime operation with 99.99% uptime. You will have to demonstrate your technical mettle and influence and inspire the engineers to build things right. You will be working on problems in one or more of these areas:

• Data Collection: a horizontally scalable platform to collect billions of events from around the world in as little as 50 msec.
• Intelligent Campaign Engines: make real-time decisions using those events on the best experience to display, in as little as 200 msec.
• Real-time Stream Computation: compute thousands of metrics on the incoming billions of events and make them available for decisioning and analytics.
• Data Pipeline: a scalable data transport layer using Apache Kafka, running across hundreds of servers and transporting billions of events in real time.
• Data Analysis: distributed OLAP engines on Hadoop or Spark to provide real-time analytics on the data.
• Large-scale Machine Learning: supervised and unsupervised learning on Hadoop and Spark using the best open source frameworks.

In this role, you will present your work at meetups and conferences worldwide and contribute to open source. You will help attract the right talent and groom engineers to be the best.

Must Have

Engineering
• Strong foundation in Computer Science, through education and/or experience: data structures, algorithms, design thinking, optimization.
• An outstanding technical contributor with accomplishments that include building products and platforms at scale.
• Outstanding technical acumen and a deep understanding of the problems of distributed systems and scale, with a strong orientation towards open source.
• Experience building platforms that have 99.99% uptime requirements and operate at scale.
• Experience working in a fast-paced environment with attention to detail and incremental delivery through automation.
• Loves to code more than to talk.
• 10+ years of experience building software systems, or able to demonstrate such maturity without the years under the belt.

Behavioral
• Loads of energy and a can-do attitude to take BIG problems by the horns and solve them.
• Entrepreneurial spirit to conceive ideas, turn challenges into opportunities and build products.
• Ability to inspire other engineers to do the unimagined and go beyond their comfort zones.
• A role model for up-and-coming engineers in the organization, especially new college grads.

Technology background
• Strong preference for experience with open source technologies: working with various Java application servers or Scala.
• Experience deploying web applications and services that run across thousands of servers globally with very low latency and high uptime.
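The posting above describes computing thousands of metrics over billions of streamed events to drive sub-300 msec decisions. As a rough, framework-free illustration of the core pattern (tumbling-window aggregation), here is a minimal Python sketch; the function name, event shape and window size are invented for illustration, and a production system would run this logic on the Spark/Akka stack named above rather than in plain Python:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Group (timestamp_ms, metric_name) events into fixed windows
    and count occurrences per metric -- the kind of aggregation a
    real-time stream job would maintain continuously."""
    counts = defaultdict(lambda: defaultdict(int))
    for ts, metric in events:
        window_start = (ts // window_ms) * window_ms  # bucket by window
        counts[window_start][metric] += 1
    return {w: dict(m) for w, m in counts.items()}

events = [(100, "click"), (250, "view"), (350, "click"),
          (999, "click"), (1200, "view")]
print(tumbling_window_counts(events, 1000))
# {0: {'click': 3, 'view': 1}, 1000: {'view': 1}}
```

In the real pipeline, Kafka would feed the events and a stream engine would maintain such windows across a cluster; the per-window logic is the same.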
Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined 200+ million end users. For this role, a credible display of innovation in past projects (or academia) is a must. We are looking for a candidate who lives and breathes data and algorithms, loves to play with big data engineering, and is hands-on with Apache Spark, Kafka, RDBMS/NoSQL databases, big data analytics, and Unix and production servers. A Tier-1 college degree (BE from IITs, BITS Pilani, top NITs or IIITs, or an MS from Stanford, Berkeley, CMU or UW–Madison) or an exceptionally strong work history is a must. Let us know if this interests you and you would like to explore the profile further.
Looking for senior data science researchers.

Basic Qualifications:
∙ Bachelor's in Computer Science/Mathematics plus research experience (machine learning, deep learning, statistics, data mining, game theory or core mathematical areas).
∙ 3+ years of relevant experience building large-scale machine learning or deep learning models and/or systems.
∙ 1 year or more of experience specifically with deep learning (CNN, RNN, LSTM, RBM, etc.).
∙ Strong working knowledge of deep learning, machine learning and statistics.
∙ Deep domain understanding of personalization, search and visual.
∙ Strong math skills with statistical modelling / machine learning.
∙ Hands-on experience building models with deep learning frameworks like MXNet or TensorFlow.
∙ Experience using Python and statistical/machine learning libraries.
∙ Ability to think creatively and solve problems.
∙ Data presentation skills.

Preferred:
∙ MS/PhD (machine learning, deep learning, statistics, data mining, game theory or core mathematical areas) from IISc or other top global universities.
∙ Or, publications in highly accredited journals (if available, please share links to your published work).
∙ Or, a history of scaling ML/deep learning algorithms at massively large scale.
Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined 200+ million end users. The founding team consists of BITS Pilani alumni with experience creating global startup success stories. The core team we are building consists of some of the best minds in India in artificial intelligence research and data engineering. We are looking to fill multiple roles requiring 2-7 years of research or large-scale production implementation experience, with:
- Rock-solid algorithmic capabilities.
- Production deployments of massively large-scale systems, real-time personalization, big data analytics, or semantic search.
- Or credible research experience in innovating new ML algorithms and neural nets.
A GitHub profile link is highly valued. For the right fit into the Couture.ai family, compensation is no bar.
Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined 200+ million end users. For this role, a credible display of innovation in past projects is a must. We are looking for hands-on leaders in data engineering with 5-11 years of research or large-scale production implementation experience, with:
- Proven expertise in Spark, Kafka and the Hadoop ecosystem.
- Rock-solid algorithmic capabilities.
- Production deployments of massively large-scale systems, real-time personalization, big data analytics and semantic search.
- Expertise in containerization (Docker, Kubernetes) and cloud infrastructure, preferably OpenStack.
- Production experience with Spark ML, TensorFlow (and TF Serving), MXNet, Scala, Python, NoSQL databases, Kubernetes, and ElasticSearch/Solr.
A Tier-1 college degree (BE from IITs, BITS Pilani, IIITs, top NITs, DTU or NSIT, or an MS from Stanford, UC, MIT, CMU, UW–Madison, ETH or other top global schools) or an exceptionally strong work history is a must. Let us know if this interests you and you would like to explore the profile further.
We are passionate technologists who believe in the power of software and technology as tools for social change. The 1000+ people in ThoughtWorks India are as diverse in personality as we are in our backgrounds, culture and expertise. ThoughtWorks is a technology-savvy company, and we are always on the lookout for people who are passionate about coding, have strong object-oriented concepts and have hands-on coding experience in OOP-based languages (Java, J2EE). The best source of information about ThoughtWorks is our corporate website.

If you’re someone who’s inspired by technology, then by joining ThoughtWorks you become part of a community. People join because they get to talk to the people who wrote the books that influenced them, work with the people who wrote the tools they would like to use, and collaborate on projects that propel change in the real world.

As an Application Developer at ThoughtWorks, you’ll get to:
• Think through hard problems in a consultancy environment, and work with amazing people to make the solutions a reality
• Work in a dynamic, collaborative, non-hierarchical environment where your talent is valued over your job title or years of experience
• Build custom software using the latest technologies and tools
• Craft your own career path

You'll be responsible for:
• Creating complex, enterprise-transforming applications on diverse, high-energy teams
• Using the latest tools and techniques
• Hands-on coding, usually in a pair-programming environment
• Working in highly collaborative teams and building quality code
• Working in lots of different domains and client environments
• Understanding the business domain deeply and working closely with business stakeholders

Ideally, you should have:
• A minimum of 3 years of development and delivery experience with Java / C# / Ruby / Python / Scala / Clojure / Django
• Hands-on experience in analysis, design, coding and implementation of complex, custom-built applications
• Strong object-oriented skills, including solid design-patterns knowledge
• Familiarity with relational databases (preferably MySQL, Oracle, PostgreSQL or SQL Server) or NoSQL stores
• Experience working with, or an interest in, Agile methodologies such as Extreme Programming (XP) and Scrum
• Knowledge of software best practices, like Test-Driven Development (TDD) and Continuous Integration (CI)
• Strong communication and client-facing skills, with the ability to work in a consulting environment, are essential
• A desire to contribute to the wider technical community through collaboration, coaching and mentoring of other technologists

Senior developers (8+ years) are expected to act as the architect on relatively smaller enterprise-level projects; on larger projects, you are expected to work closely with fellow architects to shape the architecture and take it forward.
(Python OR Java OR scala) AND "machine learning" AND Spark
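The line above is a boolean sourcing query rather than prose. As an illustrative sketch (not part of any posting), here is how such a filter could be applied to job-description text in Python; the function name and the naive substring matching are assumptions for illustration:

```python
def matches(text,
            any_of=("python", "java", "scala"),
            all_of=("machine learning", "spark")):
    """Evaluate (Python OR Java OR Scala) AND "machine learning" AND Spark
    against a piece of text, case-insensitively.

    Naive substring matching: e.g. "javascript" would also match "java";
    a real search engine would tokenize the text first."""
    t = text.lower()
    return (any(term in t for term in any_of)
            and all(term in t for term in all_of))

print(matches("Senior engineer: Scala, Spark, machine learning"))  # True
print(matches("Java developer with Spark experience"))             # False
```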
Software Engineer

At Tekion, we believe that business applications can be as simple and cool as the best social or consumer apps, yet powerful enough to seamlessly and efficiently run large global businesses. To bring this belief to reality, we are building the world's best business application on the cloud, using the entire spectrum of available technology (big data, machine learning, and human-computer interaction: voice, touch, vision, sensors and IoT), and we are inventing technology to overcome anything that limits our goals. The end goal is to enable and empower our customers and bring maximum value to their core business. We are initially focused on transforming one industry vertical, for which we already have potential customers signed up. Two of the world's largest companies in this industry are investing in us, along with some world-renowned investors and venture capitalists. We have a proven, world-class technology team that is solving complex problems to make life super simple for our customers. If you love solving complex problems and want to be part of the team shaping the next generation of business applications on the cloud, while having fun doing it, then we are looking for you.

We are looking for talented Senior Software Engineers who want to be part of building the next generation of business applications on the cloud. The Senior Software Engineer will use his/her passion and expertise to create world-class products, and will collaborate with the product and engineering teams and development managers to create the business application of the future.

Key Responsibilities
• Write high-quality code and take responsibility for your tasks
• Own critical components, and be responsible for the subsystems you work on, from design, code, testing and integration through deployment and enhancements
• Build large, scalable applications for cloud deployment
• Mentor the team

Minimum Qualifications
• Bachelor's in Computer Science or a relevant field
• 5+ years of experience
• Experience in React JS
• Strong understanding of XML, JSON, the DOM and W3C standards
• Experience developing UI for e-commerce, social media, collaboration platforms, etc.
• Strong sense of ownership
• An attitude for getting things done

Preferred Qualifications
• Bachelor's/Master's in Computer Science
• Experience developing enterprise systems such as e-commerce, manufacturing or supply chain
• Experience in web programming (React JS)
• Excellent understanding of performance and optimization techniques
US-based multinational company. Hands-on Hadoop.
Job Title: Distributed Systems Engineer - SDET
Job Location: Pune, India

Job Description: Are you looking to put your computer science skills to use? Are you looking to work for one of the hottest start-ups in Silicon Valley? Are you looking to define the next-generation data management platform based on Apache Spark? Are you excited by the idea of being a Spark committer? If you answered yes to all of the questions above, we definitely want to talk to you. We are looking to add highly motivated engineers to work as QE software engineers in our product development team in Pune. We work on cutting-edge data management products that transform the way businesses operate. As a distributed systems engineer (if you are good), you will get to work on defining key elements of our real-time analytics platform, including:
1. Distributed in-memory data management
2. OLTP and OLAP querying in a single platform
3. Approximate query processing over large data sets
4. Online machine learning algorithms applied to streaming data sets
5. Streaming and continuous querying

Requirements:
1. Experience testing modern SQL and NewSQL products is highly desirable
2. Experience with the SQL language, JDBC, and end-to-end testing of databases
3. Hands-on experience writing SQL queries
4. Experience with database performance benchmarks like TPC-H, TPC-C and TPC-E is a plus
5. Prior experience benchmarking against Cassandra or MemSQL is a big plus
6. You should be able to program in Java or have some exposure to functional programming in Scala
7. You should care about performance, and by that we mean performance optimizations in a JVM
8. You should be self-motivated and driven to succeed
9. If you are an open source committer on any project, especially an Apache project, you will fit right in
10. Experience working with Spark, Spark SQL and Spark Streaming is a BIG plus
11. Plans and authors test plans, and ensures testability is considered by development in all stages of the life cycle
12. Plans, schedules and tracks the creation of test plans and automation scripts, using defined methodologies for manual and/or automated tests
13. Works as a QE team member in troubleshooting, isolating, reproducing and tracking bugs, and verifying fixes
14. Analyzes test results to verify existing functionality and recommends corrective action; documents test results and manages and maintains defect and test case databases to assist in process improvement and estimation of future releases
15. Performs the assessment and planning of test efforts required for automation of new functions/features under development; influences design changes to improve quality and feature testability
16. If you have solved big, complex problems, we want to talk to you
17. If you are a math geek with a background in statistics and mathematics, and you know what a linear regression is, this just might be the place for you
18. Exposure to stream data processing with Storm or Samza is a plus

Open source contributors: send us your GitHub id.

Product: SnappyData is a new real-time analytics platform that combines probabilistic data structures, approximate query processing and in-memory distributed data management to deliver powerful analytic querying and alerting capabilities on Apache Spark, at a fraction of the cost of traditional big data analytics platforms. SnappyData fuses the Spark computational engine with a highly available, multi-tenanted in-memory database to execute OLAP and OLTP queries on streaming data. Further, SnappyData can store data in a variety of synopsis data structures to provide extremely fast responses using fewer resources. Finally, applications can either submit Spark programs or connect via JDBC/ODBC to run interactive or continuous SQL queries.

Skills:
1. Distributed systems
2. Scala
3. Apache Spark
4. Spark SQL
5. Spark Streaming
6. Java
7. YARN/Mesos

What's in it for you:
1. Cutting-edge work that is ultra-meaningful
2. Colleagues who are the best of the best
3. Meaningful startup equity
4. Competitive base salary
5. Full benefits
6. Casual, fun office

Company Overview: SnappyData is a Silicon Valley funded startup founded by engineers who pioneered the distributed in-memory data business. It is advised by some of the legends of the computing industry, who have been instrumental in creating multiple disruptions that have defined computing over the past 40 years. The engineering team that powers SnappyData built GemFire, one of the industry-leading in-memory data grids, which is used worldwide in mission-critical applications ranging from finance to retail.
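The SnappyData description above notes that applications can connect over JDBC/ODBC to run SQL against the in-memory store. As a stand-in sketch of that interaction pattern, here is the equivalent flow using Python's built-in sqlite3 in-memory database (not SnappyData itself; the table and data are invented for illustration):

```python
import sqlite3

# In-memory database standing in for an in-memory SQL store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [("a", 10.0), ("b", 5.0), ("a", 2.5)])

# An OLAP-style aggregate of the kind such a platform serves interactively.
rows = conn.execute(
    "SELECT user_id, SUM(amount) FROM events"
    " GROUP BY user_id ORDER BY user_id"
).fetchall()
print(rows)  # [('a', 12.5), ('b', 5.0)]
conn.close()
```

Against SnappyData the same queries would go through a JDBC/ODBC driver, and could also be continuous queries over streaming tables.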
Looking for a technically sound, excellent trainer in big data technologies. This is an opportunity to become well known in the industry and gain visibility. Host regular sessions on big data technologies and get paid to learn.
The candidate will work in a team developing Clojure applications for data-related tasks.
Essential Job Functions:
• Designs and develops digital solutions.
• Develops software solutions.
• Performs unit testing and test-driven development (TDD).
• Troubleshoots and resolves technical problems.
• Designs, maintains, and supports cloud infrastructure.
• Collaborates with internal and external departments to accomplish various tasks and projects.

Required Skills
• Functional programming experience using Scala - required.
• Backend development experience - preferred.
• Search engine, indexing, and full-text search experience - preferred.
• Experience with and/or knowledge of AWS Cloud - required.

Required Experience
• Bachelor's degree in computer science, information technology, or a related degree, or equivalent experience and training.
• 3-5 years of Scala and Java programming experience - required.
• 3-5 years of integration and implementation experience.
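The posting above asks for search engine, indexing and full-text search experience. As a minimal sketch of the central data structure behind full-text search (an inverted index), written in Python for illustration rather than the Scala stack the posting names, with invented documents:

```python
from collections import defaultdict

def build_inverted_index(docs):
    """Map each term to the set of document ids containing it --
    the lookup structure that makes full-text search fast."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

docs = {1: "scala backend service", 2: "full text search in scala"}
index = build_inverted_index(docs)
print(sorted(index["scala"]))   # [1, 2]
print(sorted(index["search"]))  # [2]
```

Engines like those behind full-text search add tokenization, stemming and ranking on top of this core idea.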
Scienaptic (www.scienaptic.com) is a new-age technology and analytics company based in NY and Bangalore. Our mission is to infuse robust decision science into organizations. Our mantra for achieving our mission is to reduce friction among technology, processes and humans. We believe that good design thinking needs to permeate all aspects of our activities, so that our customers get the best possible aesthetic and least-friction experience of our software and services.

As a Principal Software Development Engineer, you will be responsible for developing and augmenting the software components used to solve the analytics problems of large enterprises. These components are highly scalable, connect with multiple data sources and implement some complex algorithms.

We are funded by very senior and eminent business leaders in India and the US. Our lead investor is Pramod Bhasin, who is known as a pioneer of the ITES revolution. We have the working environment of a new-age, cool startup. We are firm believers that the best talent grounds are non-hierarchical in structure and spirit, and we expect you to enjoy, thrive and empower others by advancing that culture.

Requirements:
- All-round experience in developing and delivering large-scale business applications, in scale-up systems as well as scale-out distributed systems.
- Identify the appropriate software technology/tools based on the requirements and design elements contained in a system specification.
- Implement complex algorithms in a scalable fashion.
- Work closely with product and analytics managers, user interaction designers, and other software engineers to develop new product offerings and improve existing ones.
Qualifications/Experience:
- Bachelor's or Master's degree in computer science or a related field
- 10 to 12 years of experience in core Java programming (JDK 1.7/JDK 1.8); familiarity with big data systems like Hadoop and Spark is an added bonus
- Familiarity with dependency injection, concurrency, Guice/Spring
- Familiarity with the JDBC API and databases like MySQL, Oracle, Hadoop
- Knowledge of graph databases and traversal
- Knowledge of SOLR/ElasticSearch and cloud-based deployment preferred
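The qualifications above mention graph databases and traversal. As a small illustrative sketch of the basic operation involved (breadth-first traversal over an adjacency list, in Python; the graph and names are invented for illustration):

```python
from collections import deque

def bfs_traverse(graph, start):
    """Breadth-first traversal over an adjacency-list graph -- the kind
    of neighborhood expansion a graph database performs for traversal
    queries. Returns nodes in the order they are first reached."""
    seen, order, queue = {start}, [], deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nbr in graph.get(node, []):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return order

graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(bfs_traverse(graph, "a"))  # ['a', 'b', 'c', 'd']
```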
Greetings from InfoVision Labs. InfoVision was founded in 1995 by technology professionals with a vision to provide quality, cost-effective IT solutions worldwide. InfoVision is a global IT services and solutions company with a primary focus on strategic resources, enterprise applications and technology solutions. Our core practice areas include application security, business analytics, visualization & collaboration, and wireless & IP communications. Our IT services cover the full range of enterprise needs, from staffing to solutions. Over the past decade, our ability to serve our clients has steadily evolved; it now covers multiple industries, numerous geographies and flexible delivery models, as well as state-of-the-art technologies. InfoVision opened its development and delivery center in Pune in 2014 and has been expanding with project engagements with clients based in the US and India. We can offer the right individuals an industry-leading package and fast career growth prospects. Please get to know about us at http://infovisionlabs.com/about/
We at InfoVision Labs are passionate about technology and about what our clients would like to accomplish. We continuously strive to understand business challenges, the changing competitive landscape and how cutting-edge technology can help position our clients at the forefront of the competition. We are a fun-loving team of usability experts and software engineers focused on mobile technology, responsive web solutions and cloud-based solutions.

Job Responsibilities:
◾ Minimum 3 years of experience in big data skills required.
◾ Complete life-cycle experience with big data is highly preferred.
◾ Skills: Hadoop, Spark, R, Hive, Pig, HBase and Scala.
◾ Excellent communication skills.
◾ Ability to work independently with no supervision.
Check our JD: https://www.zeotap.com/job/senior-tech-lead-m-f-for-zeotap/oEQK2fw0
Expect the Best. At Target, we have a vision: to become the best - the best culture and brand, the best place for growth and the company with the best reputation. We offer an inclusive, collaborative and energetic work environment that rewards those who perform. We deliver engaging, innovative and on-trend experiences for our team members and our guests. We invest in our team members' futures by developing leaders and providing a breadth of opportunities for professional development. It takes the best to become the best, and we are committed to building a team that does the right thing for our guests, shareholders, team members and communities. Minneapolis-based Target Corporation serves guests at stores nationwide and at Target.com. Target is committed to providing a fun and convenient shopping experience with access to unique and highly differentiated products at affordable prices. Since 1946, the corporation has given 5 percent of its income through community grants and programs like Take Charge of Education®.
Dreamweavers is a group of hard-core professionals who dream high and achieve even higher.

Our Ventures
IT SOLUTIONS: IT consultation, technology development, process outsourcing.
REAL ESTATE & BRAND: through Land Weavers.
EDUCATION SOLUTIONS: Knowledge Icon, Dream tech labs and Animation bugs.
INSURANCE TRAINING SOLUTIONS: through Dreamweaversindia.com.
TELECOM SOLUTIONS: through DreamTel.
DIGITAL MEDIA SOLUTIONS: through Dream Media.
Crest (part of the Springer Nature group): Headquartered in Pune, Crest is a Springer Nature company that delivers cutting-edge IT and ITeS solutions to some of the biggest scientific content and database brands in the world. Our global teams work closely with our counterparts and clients in Europe, the USA and New Zealand, leveraging the latest technology, marketing intelligence and subject-matter expertise. With handpicked SMEs in a range of sciences, and technology teams working on the latest ECM, Scala, SAP and MS tech platforms, Crest not only develops quality STM content but continuously enhances the channels through which it is delivered to the world. Crest is an ISO 9001-certified company driven by over 1000 professionals in technology, research & analysis, and marketing & BPM.

Specialties: 1. Technology 2. Research 3. Marketing Intelligence 4. Business Process Management
Develop analytic tools, working on big data and distributed systems.
- Provide technical leadership in developing our core analytics platform
- Lead development efforts on product features using Scala/Java
- Demonstrable excellence in innovation, problem solving, analytical skills, data structures and design patterns
- Expertise in building applications using Spark and Spark Streaming
- Exposure to NoSQL (HBase/Cassandra), Hive, Pig Latin and Mahout
- Extensive experience with Hadoop and machine learning algorithms
A new team is being formed. This is a great learning opportunity to help build a team with an engineering focus, with a multitude of opportunities across the technology spectrum.
Idea has a unique approach, and this could be a big move for its technology platform in the future.
The Microsoft Office India team, located in Hyderabad (IDC), is building a set of next-generation experiences.
• Are you fascinated by building highly scalable APIs on a reliable stack that can fall back from persistent connections to SMS?
• Can you build and run services infrastructure that scales to billions of transactions per day?
• Can you build UI infrastructure that can be extended in infinite ways?
We are part of the group whose mission is to reimagine productivity applications on mobile devices for emerging markets. A solid engineering culture, a fun set of people and tough problems to solve are part of a deal you will find hard to say no to. If you have the technical chops, we would love to hear from you.