Job description
What do Microsoft, The Gap, Royal Bank of Scotland, Lockheed Martin, and top open source projects like jQuery have in common? They all use BrowserStack, as do over 25,000 other customers! BrowserStack is the industry-leading cloud web and mobile testing platform that enables developers to test their websites and apps on different operating systems and mobile devices. Since launching in 2011, our mission has been bold yet simple: to be the testing infrastructure for the internet. Six years and billions of tests later, we are ready for our next phase of hypergrowth. Bootstrapped for the first six years, we have been profitable since inception, with near-exponential growth in customers around the globe. Recently, we closed $50M in Series A funding from Accel.
Are you the one who wants to work on a tech-heavy product and the challenging technical problems it entails? Problems vary, and can be as diverse as scaling the product smoothly as the company grows, setting up a device farm, or solving streaming issues through a browser without the use of plugins. Are you excited by the challenge of thinking critically across many computer science disciplines, including product design, usability, APIs and user-centric online applications, business logic, scaling, performance, and 24x7 reliability?
Job Responsibilities:
- Work on the web application layer, backend, systems, streaming, and other associated technology to build our product and its components
- Find solutions to issues across a variety of operating systems and programming languages
- Research new technologies and adapt them to BrowserStack's requirements
- Own and commit to all your work, and be accountable for your results
- Use and understand code from open source projects
- Teach others how to use new software
- Be willing to learn new programming languages and databases
- Write efficient SQL queries and design schemas for relational databases
- Participate in a culture of code reviews, writing tech specs, and collaborating closely with other people (no lone wolves)
- Produce high-quality software that is unit tested, code reviewed, and checked in regularly for continuous integration
- Develop multi-tier, scalable, high-volume, reliable, user-centric applications that operate 24x7
- Scale distributed applications, make architectural trade-offs applying synchronous and asynchronous design patterns, write code, and deliver with speed and quality
REQUIREMENTS
- Good experience in at least one scripting language: Ruby, Node.js, Python, AppleScript, Unix shell, or similar
- Familiarity with one compiled language: C, Java, C++, Go, or similar
- Good knowledge of operating systems and networking concepts
- Reasonable knowledge of Windows and/or Linux operating systems
- Ability to work on Windows and Linux below the application layer, including file systems, kernels, custom installations, shell scripting, internal APIs, etc.
- Aggressive problem diagnosis and creative problem-solving skills
- Startup mentality, high willingness to learn, and a strong work ethic
- 2+ years of experience
BENEFITS
Our benefits include a competitive salary, a bonus and equity program, 100% company-paid medical insurance, a flexible and generous vacation policy, daily catered lunch, free snacks, etc.
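The SQL requirement above (efficient queries and relational schema design) is easy to sketch concretely. Below is a minimal, hedged illustration using Python's built-in sqlite3; the tables and columns are hypothetical examples, not BrowserStack's actual schema.

```python
import sqlite3

# Hypothetical schema illustrating relational design: an indexed
# foreign key keeps the join/filter below from scanning the whole table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE test_runs (
        id      INTEGER PRIMARY KEY,
        user_id INTEGER NOT NULL REFERENCES users(id),
        browser TEXT NOT NULL,
        passed  INTEGER NOT NULL
    );
    CREATE INDEX idx_runs_user ON test_runs(user_id);
""")
conn.execute("INSERT INTO users VALUES (1, 'alice')")
conn.executemany(
    "INSERT INTO test_runs (user_id, browser, passed) VALUES (?, ?, ?)",
    [(1, "chrome", 1), (1, "firefox", 0), (1, "chrome", 1)],
)

# Aggregate pass counts per browser for one user, using the index.
rows = conn.execute("""
    SELECT browser, SUM(passed), COUNT(*)
    FROM test_runs
    WHERE user_id = ?
    GROUP BY browser
    ORDER BY browser
""", (1,)).fetchall()
print(rows)  # [('chrome', 2, 2), ('firefox', 0, 1)]
```

Parameter binding (`?`) and the `idx_runs_user` index are the two habits the posting's "efficient SQL" bullet points at: safe queries and access paths that avoid full scans.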
Senior Engineer - Development
Our Company
We help people around the world save money and live better, anytime and anywhere, in retail stores, online, and through their mobile devices. Each week, more than 220 million customers and members visit our 11,096 stores under 69 banners in 27 countries and e-commerce websites in 10 countries. With last fiscal year's revenues of approximately $486 billion, Walmart employs 2.2 million people worldwide. @WalmartLabs in Bangalore uses technology to build brand-new platforms and services on the latest technology stack to support both our stores and e-commerce businesses worldwide.
Our Team
The Global Data and Analytics Platforms (GDAP) team @WalmartLabs in Bangalore provides the Data Foundation Infrastructure, Visualization Portal, Machine Learning Platform, Customer Platform, and Data Science products that form part of the core platforms and services that drive Walmart's business. The group also develops analytical products for several verticals, such as supply chain, pricing, customer, and HR. Our team, part of GDAP Bangalore, is responsible for creating the Customer Platform, a one-stop shop for all customer analytics for Walmart stores; a Machine Learning Platform that provides end-to-end infrastructure for data scientists to build ML solutions; and an Enterprise Analytics group that provides analytics for HR, Global Governance, and Security. The team is responsible for time-critical, business-critical, and highly reliable systems that influence almost every part of the Walmart business. The team is spread over multiple locations, and the Bangalore centre owns critical end-to-end pieces that we design, build, and support.
Your Opportunity
As part of the Customer Analytics team @WalmartLabs, you'll have the opportunity to make a difference by being part of the development team that builds products at Walmart scale, forming the foundation of customer analytics across Walmart.
One of the key attributes of this job is that you are required to continuously innovate and apply technology to provide a 360-degree business view of Walmart customers.
Your Responsibility
• Design, build, test, and deploy cutting-edge solutions at scale, impacting millions of customers worldwide, and drive value from data at Walmart scale
• Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community
• Engage with Product Management and Business to drive the agenda, set your priorities, and deliver awesome product features to keep the platform ahead of market scenarios
• Identify the right open source tools to deliver product features by performing research, POCs/pilots, and/or interacting with various open source forums
• Develop and/or contribute features that enable customer analytics at Walmart scale
• Deploy and monitor products on cloud platforms
• Develop and implement best-in-class monitoring processes to enable data applications to meet SLAs
Our Ideal Candidate
You have a deep interest in and passion for technology. You love writing and owning code and enjoy working with people who will keep challenging you at every stage. You have strong problem-solving, analytical, and decision-making skills, along with excellent communication and interpersonal skills. You are self-driven and motivated, with the desire to work in a fast-paced, results-driven agile environment with varied responsibilities.
Your Qualifications
• Bachelor's degree and 7+ years of experience, or Master's degree and 6+ years of experience, in Computer Science or a related field
• Expertise in the big data ecosystem, with deep experience in Java, Hadoop, Spark, Storm, Cassandra, NoSQL, etc.
• Expertise in MPP architecture and knowledge of an MPP engine (Spark, Impala, etc.)
• Experience building scalable, highly available distributed systems in production
• Understanding of stream processing, with expert knowledge of Kafka and either Spark Streaming or Storm
• Experience with SOA
• Knowledge of graph databases such as Neo4j or Titan is definitely a plus
• Knowledge of software engineering best practices, with experience implementing CI/CD and log aggregation/monitoring/alerting for production systems
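The stream-processing qualification above centers on windowed aggregation over event streams, the core computation a Kafka plus Spark Streaming job performs. As a rough, framework-free sketch of the idea (a toy, not Walmart's actual pipeline), a tumbling-window count fits in a few lines of plain Python; the event names are made up for illustration.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per (window, key) pair -- a toy version of the
    windowed aggregation a stream-processing job computes.
    `events` is an iterable of (timestamp_ms, key) pairs."""
    counts = defaultdict(int)
    for ts, key in events:
        # Bucket each event into the tumbling window containing it.
        window_start = (ts // window_ms) * window_ms
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical clickstream events: (timestamp in ms, event name).
events = [(10, "add_to_cart"), (40, "add_to_cart"), (70, "checkout")]
print(tumbling_window_counts(events, 50))
# {(0, 'add_to_cart'): 2, (50, 'checkout'): 1}
```

In a real deployment the same grouping runs partitioned across a cluster, with Kafka providing the ordered, replayable event log that makes recomputation after failure possible.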
- 3+ years of relevant technical experience
- Strong proficiency with Core Java or with Scala on Spark (Hadoop)
- Database experience, preferably with DB2, Sybase, or Oracle
- Experience with the complete SDLC process and Agile methodology (Scrum)
- Strong oral and written communication skills
- Excellent interpersonal skills and a professional approach
Job Description
You will join the core product team to design and develop Mobikon's products, which are changing how restaurants run their business. As part of a small team, you'll be involved in the entire product lifecycle, from conception to prototyping to development to launch. You will help build a platform for interacting with third-party partners by defining APIs and managing other toolkits. You'll have the freedom to make key product decisions and be given significant product ownership. We believe in Agile, short release cycles, and quick iterations. Like all good startups, we work hard and move fast in an organized chaos; but you'll have fun and have a big influence on the success of the company. We are solving hard problems, and you will often face exciting, even daunting, challenges. There's A LOT of room for cool innovation in our space, and we love to work with people who are brimming with ideas and not shy about sharing them. We look for people who are driven by a passion to build awesome products that delight our users.
Responsibilities
- Design and develop elegant backend code that is well documented, and write tests for the same.
- Build the core framework for a high-scale, highly reliable web service.
- Work with many third parties and partners to design and develop outbound/inbound integrations.
- Research and develop new approaches to problems.
- Take start-to-finish technical ownership of features and/or applications, from inception to delivery.
- Do what it takes to get things done (pick up new technologies, and work with front-end technologies as and when required).
Qualifications
- Degree in computer science or equivalent.
- A solid foundation in computer science, with strong competencies in data structures, algorithms, analysis, and software design, and excellent problem-solving skills.
- 3+ years of experience building web applications with any high-level language.
- Experience exposing functionality through RESTful interfaces.
- Skilled in a variety of data stores, database design, data modelling, and SQL queries.
- Strong troubleshooting skills and attention to detail.
- Strong written and oral communication skills, with the ability to multi-task, establish priorities, and meet tight deadlines.
- Fast learner; independent and self-teaching.
- Team player, with the ability to work cooperatively with other engineers.
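The "exposing functionality through RESTful interfaces" qualification can be sketched as a toy route table mapping (method, path) pairs to handlers: the essence of a REST layer before any framework is involved. The resource names below are hypothetical, not Mobikon's actual API.

```python
# Minimal REST-style dispatch: handlers register themselves against
# (HTTP method, path) keys; dispatch looks them up and returns
# (status code, body). A real service would sit behind an HTTP server.
ROUTES = {}

def route(method, path):
    def register(handler):
        ROUTES[(method, path)] = handler
        return handler
    return register

@route("GET", "/restaurants")
def list_restaurants():
    # Hypothetical resource for illustration only.
    return 200, [{"id": 1, "name": "Cafe One"}]

@route("POST", "/restaurants")
def create_restaurant():
    return 201, {"id": 2}

def dispatch(method, path):
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, {"error": "not found"}
    return handler()

print(dispatch("GET", "/restaurants"))    # (200, [{'id': 1, 'name': 'Cafe One'}])
print(dispatch("DELETE", "/restaurants")) # (404, {'error': 'not found'})
```

The design point worth noting: resources are nouns (`/restaurants`) and HTTP methods carry the verbs, which is what keeps third-party integrations predictable.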
About 7 Innovation Labs
Data is changing human lives at the core: we collect so much data about everything, use it to learn many things, and apply the learnings in all aspects of our lives. 7 is at the forefront of applying data and machine learning to the world of customer acquisition and customer engagement. Our customer acquisition cloud uses the best of ML and AI to reach the right audiences, and our engagement cloud powers the interactions for the best experience. We serve Fortune 100 enterprises globally and hundreds of millions of their customers every year, enabling 1.5B customer interactions annually. We work on several challenging problems in the world of data processing and machine learning, and we use artificial intelligence to power Smart Agents. How do you process millions of events in a stream to derive intelligence? How do you learn from troves of data, applying scalable machine learning algorithms? How do you combine the learnings with real-time streams to make decisions in under 300 msec at scale? We work with the best of open source technologies: Akka, Scala, Undertow, Spark, Spark ML, Hadoop, Cassandra, Mongo. Platform scale and real time are in our DNA, and we work hard every day to change the game in customer engagement. We believe in empowering smart people to work on larger-than-life problems with the best technologies and like-minded people. We are a pre-IPO, Silicon Valley-based company with many global brands as our customers: Hilton, eBay, Time Warner Cable, Best Buy, Target, American Express, Capital One, and United Airlines. We touch more than 300M visitors online every month with our technologies. We have one of the best work environments in Bangalore.
Opportunity
Principal Member of Technical Staff is one of our distinguished individual contributor roles, for engineers who can take on problems of size and scale.
You will be responsible for working with a team of smart and highly capable engineers to design solutions and work closely on implementation, testing, deployment, and 24x7 runtime operation with 99.99% uptime. You will have to demonstrate your technical mettle and influence and inspire the engineers to build things right. You will be working on problems in one or more of these areas:
Data Collection: A horizontally scalable platform to collect billions of events from around the world in as little as 50 msec.
Intelligent Campaign Engines: Make real-time decisions using the events on the best experience to display, in as little as 200 msec.
Real-time Stream Computation: Compute thousands of metrics on the incoming billions of events to make them available for decisioning and analytics.
Data Pipeline: A scalable data transport layer using Apache Kafka, running across hundreds of servers and transporting billions of events in real time.
Data Analysis: Distributed OLAP engines on Hadoop or Spark to provide real-time analytics on the data.
Large-scale Machine Learning: Supervised and unsupervised learning on Hadoop and Spark using the best of open source frameworks.
In this role, you will present your work at meetups and conferences worldwide and contribute to open source. You will help attract the right talent and groom engineers to shape up to be the best.
Must Have
Engineering
• Strong foundation in computer science, through education and/or experience: data structures, algorithms, design thinking, optimization.
• Should be an outstanding technical contributor, with accomplishments that include building products and platforms of scale.
• Outstanding technical acumen and a deep understanding of the problems of distributed systems and scale, with a strong orientation towards open source.
• Experience building platforms that have 99.99% uptime requirements and operate at scale.
• Experience working in a fast-paced environment, with attention to detail and incremental delivery through automation.
• Loves to code more than to talk.
• 10+ years of experience building software systems, or able to demonstrate such maturity without the years under the belt.
Behavioral
• Loads of energy and a can-do attitude to take BIG problems by the horns and solve them.
• Entrepreneurial spirit to conceive ideas, turn challenges into opportunities, and build products.
• Ability to inspire other engineers to do the unimagined and go beyond their comfort zones.
• Be a role model for upcoming engineers in the organization, especially new college grads.
Technology background
• Strong preference for experience with open source technologies: working with various Java application servers or Scala.
• Experience deploying web applications and services that run across thousands of servers globally with very low latency and high uptime.
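The 99.99% uptime figure that appears twice above is not just a slogan; it translates directly into an error budget. A quick calculation shows how little downtime "four nines" actually allows per year:

```python
def downtime_budget_minutes(uptime_pct, days=365):
    """Minutes of allowed downtime per period at a given uptime target."""
    total_minutes = days * 24 * 60
    return total_minutes * (1 - uptime_pct / 100)

# 99.99% ("four nines") over a 365-day year:
budget = downtime_budget_minutes(99.99)
print(round(budget, 1))  # 52.6

# Compare with three nines, which allows roughly ten times more:
print(round(downtime_budget_minutes(99.9), 1))  # 525.6
```

About 53 minutes per year, which is why the posting pairs the uptime requirement with automation and incremental delivery: at that budget, manual recovery alone is too slow.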
Couture.ai is building a patent-pending AI platform targeted towards vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined >200 million end users. For this role, a credible display of innovation in past projects (or academia) is a must. We are looking for a candidate who lives and talks data & algorithms, loves to play with big data engineering, and is hands-on with Apache Spark, Kafka, RDBMS/NoSQL databases, big data analytics, and Unix and production servers. A Tier-1 college (BE from the IITs, BITS Pilani, top NITs, or IIITs, or an MS from Stanford, Berkeley, CMU, or UW-Madison) or an exceptionally bright work history is a must. Let us know if this interests you and you would like to explore the profile further.
Looking for senior data science researchers.
Basic Qualifications:
- Bachelor's in Computer Science/Mathematics plus research (machine learning, deep learning, statistics, data mining, game theory, or core mathematical areas) from Tier-1 tech institutes.
- 3+ years of relevant experience building large-scale machine learning or deep learning models and/or systems.
- 1 year or more of experience specifically with deep learning (CNN, RNN, LSTM, RBM, etc.).
- Strong working knowledge of deep learning, machine learning, and statistics.
- Deep domain understanding of personalization, search, and visual.
- Strong math skills with statistical modeling / machine learning.
- Hands-on experience building models with deep learning frameworks like MXNet or TensorFlow.
- Experience using Python and statistical/machine learning libraries.
- Ability to think creatively and solve problems.
- Data presentation skills.
Preferred:
- MS/Ph.D. (machine learning, deep learning, statistics, data mining, game theory, or core mathematical areas) from IISc or other top global universities.
- Or publications in highly accredited journals (if available, please share links to your published work).
- Or a history of scaling ML/deep learning algorithms at massively large scale.
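As a hedged toy illustration of the statistical modeling / machine learning skills the list above asks for (deliberately framework-free, unlike the MXNet/TensorFlow work the role involves), here is a one-dimensional logistic regression fit by batch gradient descent in plain Python:

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit y ~ sigmoid(w*x + b) by batch gradient descent.
    A toy stand-in for the statistical modeling the role calls for."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(w * x + b)))  # predicted probability
            grad_w += (p - y) * x                  # d(log-loss)/dw
            grad_b += (p - y)                      # d(log-loss)/db
        w -= lr * grad_w / n
        b -= lr * grad_b / n
    return w, b

# Separable toy data: negatives below zero, positives above.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
predict = lambda x: 1 / (1 + math.exp(-(w * x + b))) > 0.5
print([predict(x) for x in xs])  # [False, False, False, True, True, True]
```

The same log-loss gradient, scaled to millions of parameters and mini-batches, is what the deep learning frameworks named above compute automatically.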
Couture.ai is building a patent-pending AI platform targeted towards vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined >200 million end users. The founding team consists of BITS Pilani alumni with experience creating global startup success stories. The core team we are building consists of some of the best minds in India in artificial intelligence research and data engineering. We are looking to fill multiple roles requiring 2-7 years of research/large-scale production implementation experience and:
- Rock-solid algorithmic capabilities.
- Production deployments of massively large-scale systems, real-time personalization, big data analytics, and semantic search.
- Or credible research experience innovating new ML algorithms and neural nets.
A GitHub profile link is highly valued. For the right fit into the Couture.ai family, compensation is no bar.
Couture.ai is building a patent-pending AI platform targeted towards vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined >200 million end users. For this role, a credible display of innovation in past projects is a must. We are looking for hands-on leaders in data engineering with 5-11 years of research/large-scale production implementation experience and:
- Proven expertise in Spark, Kafka, and the Hadoop ecosystem.
- Rock-solid algorithmic capabilities.
- Production deployments of massively large-scale systems, real-time personalization, big data analytics, and semantic search.
- Expertise in containerization (Docker, Kubernetes) and cloud infrastructure, preferably OpenStack.
- Experience with Spark ML, TensorFlow (and TF Serving), MXNet, Scala, Python, NoSQL databases, Kubernetes, and ElasticSearch/Solr in production.
A Tier-1 college (BE from the IITs, BITS Pilani, IIITs, top NITs, DTU, or NSIT, or an MS from Stanford, UC, MIT, CMU, UW-Madison, ETH, or other top global schools) or an exceptionally bright work history is a must. Let us know if this interests you and you would like to explore the profile further.
We are passionate technologists who believe in the power of software and technology as tools for social change. The 1000+ people in ThoughtWorks India are as diverse in personality as we are in our backgrounds, culture, and expertise. ThoughtWorks is a technology-savvy company, and we are always on the lookout for people who are passionate about coding, have strong object-oriented concepts, and have hands-on coding experience in an OOP-based language (Java, J2EE). The best source of information about ThoughtWorks is our corporate website. If you're someone who's inspired by technology, by joining ThoughtWorks you become part of a community. People join because they get to talk to the people who wrote the books that influenced them, work with the people who wrote the tools they would like to use, and collaborate on projects that propel change in the real world.
As an Application Developer at ThoughtWorks, you'll get to:
Think through hard problems in a consultancy environment, and work with amazing people to make the solutions a reality
Work in a dynamic, collaborative, non-hierarchical environment where your talent is valued over your job title or years of experience
Build custom software using the latest technologies and tools
Craft your own career path
You'll be responsible for:
Creating complex, enterprise-transforming applications on diverse, high-energy teams
Using the latest tools and techniques
Hands-on coding, usually in a pair-programming environment
Working in highly collaborative teams and building quality code
Working in lots of different domains and client environments
Understanding the business domain deeply and working closely with business stakeholders
Ideally, you should have:
Minimum 3 years of development and delivery experience with Java / C# / Ruby / Python / Scala / Clojure / Django.
Hands-on experience in the analysis, design, coding, and implementation of complex, custom-built applications
Great object-oriented skills, including strong design patterns knowledge
Familiarity with relational databases, preferably MySQL, Oracle, PostgreSQL, or SQL Server, as well as NoSQL stores
Experience working with, or an interest in, Agile methodologies such as Extreme Programming (XP) and Scrum
Knowledge of software best practices, like Test-Driven Development (TDD) and Continuous Integration (CI)
Strong communication and client-facing skills, with the ability to work in a consulting environment, are essential
Desire to contribute to the wider technical community through collaboration, coaching, and mentoring of other technologists
Senior developers (8+ years) are expected to be the architect for relatively smaller enterprise-level projects; for larger projects, you are expected to work closely with fellow architects to come up with the architecture and take it forward.
Software Engineer
At Tekion, we believe that business applications can be as simple and cool as the best social or consumer apps, yet powerful enough to seamlessly and efficiently run large global businesses. To bring that belief to reality, we are building the world's best business application on the cloud, using the entire spectrum of available technology: big data, machine learning, and human-computer interaction (voice, touch, vision, sensors, and IoT), and we are inventing technology to overcome anything that limits our goals. The end goal is to enable and empower our customers and bring maximum value to their core business. We are initially focused on transforming one industry vertical, for which we already have potential customers signed up. Two of the world's largest companies in this industry are investing in us, along with some world-renowned investors and venture capitalists. We have a proven, world-class technology team that is solving complex problems to make life super simple for our customers. If you love solving complex problems and want to be part of the team that is shaping the next generation of business applications on the cloud, while having fun doing it, then we are looking for you. We are looking for talented Senior Software Engineers who want to be part of building the next generation of business applications on the cloud. The Senior Software Engineer will use his/her passion and expertise to create world-class products, collaborating with the product and engineering teams and development managers to create the business application of the future.
Key Responsibilities
• Write high-quality code and take responsibility for your tasks
• Own critical components and be responsible for the subsystems you work on, from design, code, and testing through integration, deployment, enhancements, etc.
• Build large, scalable applications for cloud deployment
• Mentor the team
Minimum Qualifications
• Bachelor's in Computer Science or a relevant field
• 5+ years of experience
• Experience in React JS
• Strong understanding of XML, JSON, the DOM, and W3C standards
• Experience developing UI for e-commerce, social media, collaboration platforms, etc.
• Strong sense of ownership
• An attitude of getting things done
Preferred Qualifications
• Bachelor's/Master's in Computer Science
• Experience developing enterprise systems such as e-commerce, manufacturing, or supply chain
• Experience in web programming (React JS)
• Excellent understanding of performance and optimization techniques
US-based multinational company. Hands-on Hadoop.
Job Title: Distributed Systems Engineer - SDET
Job Location: Pune, India

Job Description: Are you looking to put your computer science skills to use? Are you looking to work for one of the hottest start-ups in Silicon Valley? Are you looking to define the next-generation data management platform based on Apache Spark? Are you excited by the idea of becoming a Spark committer? If you answered yes to all of the questions above, we definitely want to talk to you. We are looking to add highly motivated engineers to work as QE software engineers in our product development team in Pune. We work on cutting-edge data management products that transform the way businesses operate. As a distributed systems engineer, you will get to work on defining key elements of our real-time analytics platform, including:
1. Distributed in-memory data management
2. OLTP and OLAP querying in a single platform
3. Approximate query processing over large data sets
4. Online machine learning algorithms applied to streaming data sets
5. Streaming and continuous querying

Requirements:
1. Experience testing modern SQL and NewSQL products is highly desirable
2. Experience with the SQL language, JDBC, and end-to-end testing of databases
3. Hands-on experience writing SQL queries
4. Experience with database performance benchmarks such as TPC-H, TPC-C and TPC-E is a plus
5. Prior experience benchmarking against Cassandra or MemSQL is a big plus
6. You should be able to program in Java, or have some exposure to functional programming in Scala
7. You should care about performance, and by that we mean performance optimization in a JVM
8. You should be self-motivated and driven to succeed
9. If you are an open source committer on any project, especially an Apache project, you will fit right in
10. Experience working with Spark, Spark SQL and Spark Streaming is a BIG plus
11. Exposure to stream data processing (Storm, Samza) is a plus
12. If you have solved big, complex problems, we want to talk to you
13. If you are a math geek with a background in statistics and mathematics, and you know what a linear regression is, this just might be the place for you

Responsibilities:
1. Plan and author test plans, and ensure testability is considered by development at all stages of the life cycle
2. Plan, schedule and track the creation of test plans and automation scripts, using defined methodologies for manual and/or automated tests
3. Work as a QE team member in troubleshooting, isolating, reproducing and tracking bugs, and verifying fixes
4. Analyze test results to verify existing functionality and recommend corrective action; document test results, and manage and maintain defect and test case databases to assist in process improvement and estimation of future releases
5. Assess and plan the test effort required to automate new functions/features under development; influence design changes to improve quality and feature testability

Open source contributors: send us your GitHub id.

Product: SnappyData is a new real-time analytics platform that combines probabilistic data structures, approximate query processing and in-memory distributed data management to deliver powerful analytic querying and alerting capabilities on Apache Spark, at a fraction of the cost of traditional big data analytics platforms. SnappyData fuses the Spark computational engine with a highly available, multi-tenant in-memory database to execute OLAP and OLTP queries on streaming data. Further, SnappyData can store data in a variety of synopsis data structures to provide extremely fast responses with fewer resources. Finally, applications can either submit Spark programs or connect using JDBC/ODBC to run interactive or continuous SQL queries.

Skills:
1. Distributed Systems
2. Scala
3. Apache Spark
4. Spark SQL
5. Spark Streaming
6. Java
7. YARN/Mesos

What's in it for you:
1. Cutting-edge work that is genuinely meaningful
2. Colleagues who are the best of the best
3. Meaningful startup equity
4. Competitive base salary
5. Full benefits
6. Casual, fun office

Company Overview: SnappyData is a Silicon Valley funded startup founded by engineers who pioneered the distributed in-memory data business. It is advised by some of the legends of the computing industry, who have been instrumental in creating multiple disruptions that have defined computing over the past 40 years. The engineering team that powers SnappyData built GemFire, one of the industry-leading in-memory data grids, which is used worldwide in mission-critical applications ranging from finance to retail.
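The "math geek" requirement above asks that you know what a linear regression is. As an illustrative sketch only (not SnappyData code; the class and method names are hypothetical), an ordinary least-squares fit of y = slope*x + intercept can be written in plain Java:

```java
// Illustrative sketch: ordinary least-squares fit of y = slope*x + intercept.
// Hypothetical names; just the kind of basic statistics the posting alludes to.
public final class LinReg {
    // Returns {slope, intercept} for the best-fit line through (x[i], y[i]).
    public static double[] fit(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i];
            sy += y[i];
            sxx += x[i] * x[i];
            sxy += x[i] * y[i];
        }
        double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double intercept = (sy - slope * sx) / n;
        return new double[] { slope, intercept };
    }

    public static void main(String[] args) {
        // Points lying exactly on the line y = 2x + 1.
        double[] x = { 1, 2, 3, 4 };
        double[] y = { 3, 5, 7, 9 };
        double[] fit = fit(x, y);
        System.out.printf("slope=%.1f intercept=%.1f%n", fit[0], fit[1]);
    }
}
```

Running `main` on the points (1,3), (2,5), (3,7), (4,9), which lie exactly on y = 2x + 1, recovers slope 2.0 and intercept 1.0.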
We are looking for a technically sound, excellent trainer in big data technologies. This is an opportunity to gain visibility and recognition in the industry. Host regular sessions on big data technologies and get paid to learn.
The candidate would work in a team developing Clojure applications for data-related tasks.
Essential Job Functions:
• Designs and develops digital solutions.
• Develops software solutions.
• Performs unit testing and test-driven development (TDD).
• Troubleshoots and resolves technical problems.
• Designs, maintains, and supports cloud infrastructure.
• Collaborates with internal and external departments to accomplish various tasks and projects.
Required Skills
• Functional programming experience using Scala - required.
• Backend development experience - preferred.
• Search engine, indexing, and full-text search experience - preferred.
• Experience with and/or knowledge of AWS Cloud - required.
Required Experience
• Bachelor's degree in computer science, information technology, or a related degree, or equivalent experience and training.
• 3-5 years of Scala and Java programming experience - required.
• 3-5 years of integration and implementation experience.
Scienaptic (www.scienaptic.com) is a new-age technology and analytics company based in NY and Bangalore. Our mission is to infuse robust decision science into organizations. Our mantra for achieving our mission is to reduce friction among technology, processes and humans. We believe that good design thinking needs to permeate all aspects of our activities, so that our customers get the best possible aesthetic and most frictionless experience of our software and services. As a Principal Software Development Engineer, you will be responsible for the development and augmentation of the software components used to solve the analytics problems of large enterprises. These components are highly scalable, connect with multiple data sources and implement some complex algorithms. We are funded by very senior and eminent business leaders in India and the US. Our lead investor is Pramod Bhasin, who is known as a pioneer of the ITES revolution. We have the working environment of a new-age, cool startup. We are firm believers that the best talent grounds are non-hierarchical in structure and spirit. We expect you to enjoy, thrive and empower others by advancing that culture.

Requirements:
- Candidates should have all-round experience in developing and delivering large-scale business applications on scale-up systems as well as scale-out distributed systems.
- Identify the appropriate software technologies/tools based on the requirements and design elements contained in a system specification.
- Implement complex algorithms in a scalable fashion.
- Work closely with product and analytics managers, user interaction designers, and other software engineers to develop new product offerings and improve existing ones.
Qualifications/Experience:
- Bachelor's or Master's degree in computer science or a related field
- 10 to 12 years of experience in core Java programming (JDK 1.7/JDK 1.8); familiarity with big data systems like Hadoop and Spark is an added bonus
- Familiarity with dependency injection, concurrency, Guice/Spring
- Familiarity with the JDBC API and databases like MySQL, Oracle, Hadoop
- Knowledge of graph databases and traversal
- Knowledge of SOLR/ElasticSearch; cloud-based deployment experience preferred
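The qualifications above mention knowledge of graph databases and traversal. As a minimal, hypothetical sketch in plain Java (no graph database involved; the class and names are illustrative), a breadth-first traversal over an in-memory adjacency list looks like:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Queue;
import java.util.Set;

// Minimal breadth-first traversal over an in-memory adjacency list.
// Illustrative only; a graph database would expose its own traversal API.
public final class Bfs {
    public static List<String> traverse(Map<String, List<String>> adj, String start) {
        List<String> order = new ArrayList<>();
        Set<String> seen = new HashSet<>();
        Queue<String> queue = new ArrayDeque<>();
        seen.add(start);
        queue.add(start);
        while (!queue.isEmpty()) {
            String node = queue.remove();
            order.add(node);
            for (String next : adj.getOrDefault(node, List.of())) {
                if (seen.add(next)) {   // add() returns false if already visited
                    queue.add(next);
                }
            }
        }
        return order;
    }

    public static void main(String[] args) {
        Map<String, List<String>> adj = new HashMap<>();
        adj.put("a", List.of("b", "c"));
        adj.put("b", List.of("d"));
        adj.put("c", List.of("d"));
        System.out.println(traverse(adj, "a")); // prints [a, b, c, d]
    }
}
```

The `seen` set guarantees each node is visited once even when, as here, two paths lead to the same node.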
Greetings from InfoVision Labs. InfoVision was founded in 1995 by technology professionals with a vision to provide quality and cost-effective IT solutions worldwide. InfoVision is a global IT services and solutions company with a primary focus on Strategic Resources, Enterprise Applications and Technology Solutions. Our core practice areas include Applications Security, Business Analytics, Visualization & Collaboration, and Wireless & IP Communications. Our IT services cover the full range of enterprise needs, from staffing to solutions. Over the past decade, our ability to serve our clients has steadily evolved. It now covers multiple industries, numerous geographies and flexible delivery models, as well as state-of-the-art technologies. InfoVision opened its development and delivery center in 2014, at Pune, and has been expanding with project engagements with clients based in the US and India. We can offer the right individuals an industry-leading package and fast career growth prospects. Please get to know about us at http://infovisionlabs.com/about/
We at InfoVision Labs are passionate about technology and what our clients would like to accomplish. We continuously strive to understand business challenges, the changing competitive landscape, and how cutting-edge technology can help position our clients at the forefront of the competition. We are a fun-loving team of usability experts and software engineers, focused on mobile technology, responsive web solutions and cloud-based solutions.
Job Requirements:
◾Minimum 3 years of experience with Big Data skills required
◾Complete life cycle experience with Big Data is highly preferred
◾Skills: Hadoop, Spark, R, Hive, Pig, HBase and Scala
◾Excellent communication skills
◾Ability to work independently with no supervision
Check our JD: https://www.zeotap.com/job/senior-tech-lead-m-f-for-zeotap/oEQK2fw0
Our Ventures:
IT SOLUTIONS: IT Consultation, Technology Development, Process Outsourcing.
REAL ESTATE & BRAND: through Land Weavers.
EDUCATION SOLUTIONS: Knowledge Icon, Dream Tech Labs and Animation Bugs.
INSURANCE TRAINING SOLUTIONS: through Dreamweaversindia.com
TELECOM SOLUTIONS: through DreamTel
DIGITAL MEDIA SOLUTIONS: through Dream Media
Dreamweavers is a group of hard-core professionals who dream high and achieve even higher.
Crest (part of the Springer Nature group): Headquartered in Pune, Crest is a Springer Nature company that delivers cutting-edge IT and ITeS solutions to some of the biggest scientific content and database brands in the world. Our global teams work closely with our counterparts and clients in Europe, the USA and New Zealand, leveraging the latest technology, marketing intelligence and subject matter expertise. With handpicked SMEs in a range of sciences, and technology teams working on the latest ECM, Scala, SAP and MS Tech platforms, Crest not only develops quality STM content, but continuously enhances the channels through which it is delivered to the world. Crest is an ISO 9001 certified company, driven by over 1000 professionals in Technology, Research & Analysis, and Marketing & BPM. Specialties: 1. Technology 2. Research 3. Marketing Intelligence 4. Business Process Management
Develop analytic tools, working on Big Data and distributed systems.
- Provide technical leadership on developing our core analytics platform
- Lead development efforts on product features using Scala/Java
- Demonstrable excellence in innovation, problem solving, analytical skills, data structures and design patterns
- Expert in building applications using Spark and Spark Streaming
- Exposure to NoSQL (HBase/Cassandra), Hive, Pig Latin, Mahout
- Extensive experience with Hadoop and machine learning algorithms
A new team is being formed. This is a great learning opportunity to help build a team with an engineering focus, with a multitude of opportunities across the technology spectrum.
The idea has a unique approach, and it could become a big move for the technology platform in the future.
The Microsoft Office India team, located in Hyderabad, India (IDC), is building a set of next-generation experiences. • Are you fascinated by building highly scalable APIs on a reliable stack that can fall back from persistent connections to SMS? • Can you build and run services infrastructure that scales to billions of transactions per day? • Can you build UI infrastructure that can be extended in infinite ways? We are part of the group whose mission is to reimagine productivity applications on mobile devices for emerging markets. A solid engineering culture, a fun set of people, and tough problems to solve are part of the deal, and you will find it hard to say no. If you have the technical chops, we would love to hear from you.