Software Engineer

At Tekion, we believe that business applications can be as simple and cool as the best social or consumer apps, yet powerful enough to seamlessly and efficiently run large global businesses. To bring this belief to reality, we are building the world's best business application on the cloud, using the entire spectrum of available technology — big data, machine learning, and human-computer interaction (voice, touch, vision, sensors and IoT) — and we are inventing technology to overcome anything that limits our goals. The end goal is to enable and empower our customers and bring maximum value to their core business.

We are initially focused on transforming one industry vertical, for which we already have potential customers signed up. Two of the world's largest companies in this industry are investing in us, along with some world-renowned investors and venture capitalists. We have a proven, world-class technology team that is solving complex problems to make life super simple for our customers. If you love solving complex problems and want to be part of the team shaping the next generation of business applications on the cloud, while having fun doing it, then we are looking for you.

We are looking for talented Senior Software Engineers who want to be part of building the next generation of business applications on the cloud. The Senior Software Engineer will bring passion and expertise to creating world-class products, and will collaborate with the product and engineering teams and development managers to create the business application of the future.

Key Responsibilities
• Write high-quality code and take responsibility for your tasks
• Own critical components and be responsible for the subsystems you work on, from design, code, testing and integration through deployment and enhancements
• Build large, scalable applications for cloud deployment
• Mentor the team

Minimum Qualifications
• Bachelor's degree in Computer Science or a relevant field
• 5+ years of experience
• Experience in React JS
• Strong understanding of XML, JSON, DOM and W3C standards
• Experience in developing UI for e-commerce, social media, collaboration platforms, etc.
• Strong sense of ownership
• An attitude for getting things done

Preferred Qualifications
• Bachelor's/Master's degree in Computer Science
• Experience in developing enterprise systems such as e-commerce, manufacturing or supply chain
• Experience in web programming (React JS)
• Excellent understanding of performance and optimization techniques
What is Artivatic?
Artivatic is an AI enterprise tech platform built on genomic science, psychology and neuroscience capabilities to automate decision making, prediction, personalization and recommendation in real time. Artivatic is building its own proprietary algorithms, 'connected-data-genome-mapping' and 'cross-sector-connected-intelligence'. It is available in the form of an API, SDK and SaaS platform for enterprises and developers to build intelligent systems and solutions. Artivatic utilizes neuroscience-based proprietary artificial intelligence systems to solve business problems and speed up processes that are usually conducted manually.

Job description:
We are looking for someone with experience in Scala/Java and the Play framework. RESTful API development experience is mandatory, as is experience with database management systems. Preference will be given to profiles with experience in Scala; bonus points for prior experience with the Cassandra database.

Location: Koramangala, Bangalore

Roles and Responsibilities:
- Build the server-side logic that powers our APIs, in effect deploying machine learning models in a production system that can scale to billions of API calls
- Scale and performance-tune the database to handle billions of API calls and thousands of concurrent requests
- Collaborate with the data science team to build effective solutions for data collection, pre-processing and integrating machine learning into the workflow
- Collaborate, provide technical guidance, and engage in design and code review for other team members
- Excellent Scala, Cassandra, architecture, API, Python and Java programming and software design skills, including debugging, performance analysis and test design
- Proficiency with at least one Scala, Go or Python micro-framework (Flask, Tornado, Play, Spring, etc.), with experience in building REST APIs
- Experience with, or understanding of, building web crawlers, data-fetching bots, etc.
- Experience with design and optimisation of Neo4j, Cassandra and other NoSQL databases, PostgreSQL, Redis and Elasticsearch
- Familiarity with one of the cloud service providers, AWS or Google Compute Engine
- Computer Science degree with 5+ years of backend programming experience
A US-based multinational company. Hands-on Hadoop experience.
Equal Experts makes simple solutions to big business problems. We provide tailored, end-to-end services in software development and delivery – from user research and design, to technical architecture, development and QA, all the way to DevOps, continuous delivery, hosting and support.

With offices in the UK, US, Portugal, India and Canada, our network of over 700 experienced software consultants – a blend of permanent employees and associates – has created software for a wide range of public and private sector clients. These include organisations as diverse as HMRC, the Home Office, O2, Camelot and major institutions in the publishing, financial and retail sectors. Continuing growth saw our total sales reach £42 million in 2015/16.

Everyone at Equal Experts is committed to using technology and modern agile practices to deliver measurable business value. Our people typically have at least 10 years' experience in delivering valuable, working software, and this focus on experience sets us apart – it's what allows us to develop high-quality software faster, and for lower cost.

Company details
Website: http://www.equalexperts.com
Headquarters: London
Year founded: 2007
Company type: Privately Held
Company size: 501-1,000 employees
Specialties: Application Development, Offshore & Distributed Delivery, System Integration, Iterative Delivery, Digital Transformation, User Centred Design, Continuous Delivery, E-Commerce, DevOps, Big Data, Mobile (Web and Native), Open Source, Java, Scala, NoSQL, .Net, REST, and Cloud
Job Title: Distributed Systems Engineer - SDET
Job Location: Pune, India

Job Description:
Are you looking to put your computer science skills to use? Are you looking to work for one of the hottest start-ups in Silicon Valley? Are you looking to define the next-generation data management platform based on Apache Spark? Are you excited by the idea of being a Spark committer? If you answered yes to all of the questions above, we definitely want to talk to you.

We are looking to add highly motivated engineers to work as QE software engineers in our product development team in Pune. We work on cutting-edge data management products that transform the way businesses operate. As a distributed systems engineer (if you are good), you will get to work on defining key elements of our real-time analytics platform, including:
1. Distributed in-memory data management
2. OLTP and OLAP querying in a single platform
3. Approximate query processing over large data sets
4. Online machine learning algorithms applied to streaming data sets
5. Streaming and continuous querying

Requirements:
1. Experience in testing modern SQL and NewSQL products highly desirable
2. Experience with the SQL language, JDBC, and end-to-end testing of databases
3. Hands-on experience in writing SQL queries
4. Experience with database performance benchmarks like TPC-H, TPC-C and TPC-E a plus
5. Prior experience in benchmarking against Cassandra or MemSQL a big plus
6. You should be able to program in Java or have some exposure to functional programming in Scala
7. You should care about performance, and by that we mean performance optimizations in a JVM
8. You should be self-motivated and driven to succeed
9. If you are an open source committer on any project, especially an Apache project, you will fit right in
10. Experience working with Spark, Spark SQL and Spark Streaming a BIG plus
11. Plans and authors test plans, and ensures testability is considered by development at all stages of the life cycle
12. Plans, schedules and tracks the creation of test plans and automation scripts, using defined methodologies for manual and/or automated tests
13. Works as a QE team member in troubleshooting, isolating, reproducing and tracking bugs, and verifying fixes
14. Analyzes test results to verify existing functionality and recommends corrective action; documents test results, and manages and maintains defect and test case databases to assist in process improvement and estimation of future releases
15. Performs the assessment and planning of test efforts required for automation of new functions/features under development; influences design changes to improve quality and feature testability
16. If you have solved big, complex problems, we want to talk to you
17. If you are a math geek with a background in statistics and mathematics, and you know what a linear regression is, this just might be the place for you
18. Exposure to stream data processing with Storm or Samza is a plus

Open source contributors: send us your GitHub id.

Product:
SnappyData is a new real-time analytics platform that combines probabilistic data structures, approximate query processing and in-memory distributed data management to deliver powerful analytic querying and alerting capabilities on Apache Spark, at a fraction of the cost of traditional big data analytics platforms. SnappyData fuses the Spark computational engine with a highly available, multi-tenanted in-memory database to execute OLAP and OLTP queries on streaming data. Further, SnappyData can store data in a variety of synopsis data structures to provide extremely fast responses with fewer resources. Finally, applications can either submit Spark programs or connect using JDBC/ODBC to run interactive or continuous SQL queries.

Skills:
1. Distributed Systems
2. Scala
3. Apache Spark
4. Spark SQL
5. Spark Streaming
6. Java
7. YARN/Mesos

What's in it for you:
1. Cutting-edge work that is ultra-meaningful
2. Colleagues who are the best of the best
3. Meaningful startup equity
4. Competitive base salary
5. Full benefits
6. Casual, fun office

Company Overview:
SnappyData is a Silicon Valley funded startup founded by engineers who pioneered the distributed in-memory data business. It is advised by some of the legends of the computing industry, who have been instrumental in creating multiple disruptions that have defined computing over the past 40 years. The engineering team that powers SnappyData built GemFire, one of the industry-leading in-memory data grids, which is used worldwide in mission-critical applications ranging from finance to retail.
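The product description above mentions probabilistic "synopsis data structures" used for approximate query processing. As a rough illustration of the idea — not SnappyData's actual implementation, and with arbitrary example sizes — a count-min sketch approximates item frequencies in bounded memory:

```java
// Minimal count-min sketch: an illustrative example of a synopsis data
// structure, trading exact counts for fixed memory. Not SnappyData code;
// width/depth values are arbitrary.
public class CountMinSketch {
    private final long[][] table;
    private final int width;
    private final int depth;

    public CountMinSketch(int width, int depth) {
        this.width = width;
        this.depth = depth;
        this.table = new long[depth][width];
    }

    // Derive a per-row bucket from the item's hash; the row index acts as
    // a crude seed so the rows are not perfectly correlated.
    private int bucket(String item, int row) {
        int h = (item + "#" + row).hashCode() % width;
        return h < 0 ? h + width : h;
    }

    public void add(String item) {
        for (int r = 0; r < depth; r++) {
            table[r][bucket(item, r)]++;
        }
    }

    // Estimates never undercount; hash collisions can only inflate a row's
    // counter, so taking the minimum across rows bounds the overestimate.
    public long estimate(String item) {
        long min = Long.MAX_VALUE;
        for (int r = 0; r < depth; r++) {
            min = Math.min(min, table[r][bucket(item, r)]);
        }
        return min;
    }

    public static void main(String[] args) {
        CountMinSketch cms = new CountMinSketch(256, 4);
        for (String s : new String[]{"spark", "spark", "spark", "sql"}) {
            cms.add(s);
        }
        System.out.println(cms.estimate("spark")); // >= 3 (exact unless buckets collide)
    }
}
```

The memory footprint is fixed at width × depth counters regardless of how many distinct items are seen, which is what lets synopsis structures answer frequency queries over very large streams "with fewer resources" at the cost of bounded overestimation.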
Looking for a technically sound and excellent trainer in big data technologies. Get an opportunity to become popular in the industry and gain visibility. Host regular sessions on big-data-related technologies and get paid to learn.
Experience: Minimum of 3 years of relevant development experience
Qualification: BS in Computer Science or equivalent

Skills Required:
• Server-side developers with good server-side development experience in Java and/or Python
• Exposure to data platforms (Cassandra, Spark, Kafka) a plus
• Interest in machine learning a plus
• Good-to-great problem-solving and communication skills
• Ability to deliver in an extremely fast-paced development environment
• Ability to handle ambiguity
• Should be a good team player

Job Responsibilities:
• Learn the technology area where you are going to work
• Develop bug-free, unit-tested and well-documented code as per requirements
• Stringently adhere to delivery timelines
• Provide mentoring support to Software Engineers and/or Associate Software Engineers
• Any other responsibilities as specified by the reporting authority
The candidate will work in a team developing Clojure applications for data-related tasks.
Essential Job Functions:
• Designs and develops digital solutions.
• Develops software solutions.
• Performs unit testing and test-driven development (TDD).
• Troubleshoots and resolves technical problems.
• Designs, maintains, and supports cloud infrastructure.
• Collaborates with internal and external departments to accomplish various tasks and projects.

Required Skills
• Functional programming experience using Scala - required.
• Backend development experience - preferred.
• Search engine, indexing, and full-text search experience - preferred.
• Experience with and/or knowledge of AWS Cloud - required.

Required Experience
• Bachelor's degree in computer science, information technology, or a related field, or equivalent experience and training.
• 3-5 years of Scala and Java programming experience - required.
• 3-5 years of integration and implementation experience.
Scienaptic (www.scienaptic.com) is a new-age technology and analytics company based in NY and Bangalore. Our mission is to infuse robust decision science into organizations. Our mantra for achieving this mission is to reduce friction among technology, processes and humans. We believe that good design thinking needs to permeate all aspects of our activities, so that our customers get the best possible aesthetic and most frictionless experience of our software and services.

As a Principal Software Development Engineer you will be responsible for the development and augmentation of the software components used to solve the analytics problems of large enterprises. These components are highly scalable, connect with multiple data sources and implement some complex algorithms.

We are funded by very senior and eminent business leaders in India and the US. Our lead investor is Pramod Bhasin, who is known as a pioneer of the ITES revolution. We have the working environment of a new-age, cool startup. We are firm believers that the best talent grounds are non-hierarchical in structure and spirit, and we expect you to enjoy, thrive and empower others by advancing that culture.

Requirements:
- All-round experience in developing and delivering large-scale business applications, on scale-up systems as well as scale-out distributed systems
- Identify the appropriate software technology/tools based on the requirements and design elements contained in a system specification
- Implement complex algorithms in a scalable fashion
- Work closely with product and analytics managers, user interaction designers, and other software engineers to develop new product offerings and improve existing ones

Qualifications/Experience:
- Bachelor's or Master's degree in computer science or a related field
- 10 to 12 years of experience in core Java programming (JDK 1.7/JDK 1.8); familiarity with big data systems like Hadoop and Spark is an added bonus
- Familiarity with dependency injection, concurrency, Guice/Spring
- Familiarity with the JDBC API and databases like MySQL, Oracle, Hadoop
- Knowledge of graph databases and traversal
- Knowledge of Solr/Elasticsearch; cloud-based deployment experience preferred
Greetings from InfoVision Labs. InfoVision was founded in 1995 by technology professionals with a vision to provide quality and cost-effective IT solutions worldwide. InfoVision is a global IT services and solutions company with a primary focus on Strategic Resources, Enterprise Applications and Technology Solutions. Our core practice areas include Applications Security, Business Analytics, Visualization & Collaboration, and Wireless & IP Communications. Our IT services cover the full range of enterprise needs, from staffing to solutions. Over the past decade, our ability to serve our clients has steadily evolved: it now covers multiple industries, numerous geographies and flexible delivery models, as well as state-of-the-art technologies. InfoVision opened its development and delivery center in Pune in 2014 and has been expanding with project engagements with clients based in the US and India. We can offer the right individuals an industry-leading package and fast career growth prospects. Please get to know more about us at http://infovisionlabs.com/about/
We at InfoVision Labs are passionate about technology and what our clients would like to accomplish. We continuously strive to understand business challenges, the changing competitive landscape, and how cutting-edge technology can help position our clients at the forefront of the competition. We are a fun-loving team of usability experts and software engineers, focused on mobile technology, responsive web solutions and cloud-based solutions.

Job Requirements:
◾Minimum 3 years of experience in big data skills required
◾Complete life-cycle experience with big data highly preferred
◾Skills: Hadoop, Spark, R, Hive, Pig, HBase and Scala
◾Excellent communication skills
◾Ability to work independently with no supervision
Check our JD: https://www.zeotap.com/job/senior-tech-lead-m-f-for-zeotap/oEQK2fw0
Expect the Best. At Target, we have a vision: to become the best - the best culture and brand, the best place for growth and the company with the best reputation. We offer an inclusive, collaborative and energetic work environment that rewards those who perform. We deliver engaging, innovative and on-trend experiences for our team members and our guests. We invest in our team members' futures by developing leaders and providing a breadth of opportunities for professional development. It takes the best to become the best, and we are committed to building a team that does the right thing for our guests, shareholders, team members and communities. Minneapolis-based Target Corporation serves guests at stores nationwide and at Target.com. Target is committed to providing a fun and convenient shopping experience with access to unique and highly differentiated products at affordable prices. Since 1946, the corporation has given 5 percent of its income through community grants and programs like Take Charge of Education®.
• High proficiency in Java and Scala
• Expert in handling gigabyte-scale data
• Data modeling / data warehousing / BI experience a must
• Experience working on open source tools
Dreamweavers is a group of hardcore professionals who dream high and achieve even higher.

Our Ventures:
IT SOLUTIONS: IT consultation, technology development, process outsourcing
REAL ESTATE & BRAND: through Land Weavers
EDUCATION SOLUTIONS: Knowledge Icon, Dream Tech Labs and Animation Bugs
INSURANCE TRAINING SOLUTIONS: through Dreamweaversindia.com
TELECOM SOLUTIONS: through DreamTel
DIGITAL MEDIA SOLUTIONS: through Dream Media
Crest (part of the Springer Nature group): Headquartered in Pune, Crest is a Springer Nature company that delivers cutting-edge IT and ITeS solutions to some of the biggest scientific content and database brands in the world. Our global teams work closely with our counterparts and clients in Europe, the USA and New Zealand, leveraging the latest technology, marketing intelligence and subject matter expertise. With handpicked SMEs in a range of sciences, and technology teams working on the latest ECM, Scala, SAP and MS Tech platforms, Crest not only develops quality STM content but continuously enhances the channels through which it is delivered to the world. Crest is an ISO 9001 certified company, driven by over 1000 professionals in Technology, Research & Analysis, and Marketing & BPM.

Specialties:
1. Technology
2. Research
3. Marketing Intelligence
4. Business Process Management
Develop analytic tools, working on big data and distributed systems.
- Provide technical leadership in developing our core analytics platform
- Lead development efforts on product features using Scala/Java
- Demonstrable excellence in innovation, problem solving, analytical skills, data structures and design patterns
- Expert in building applications using Spark and Spark Streaming
- Exposure to NoSQL (HBase/Cassandra), Hive, Pig Latin and Mahout
- Extensive experience with Hadoop and machine learning algorithms
A new team is being formed. This is a great learning opportunity to help build a team with an engineering focus, with a multitude of opportunities across the technology spectrum.
Idea has a unique approach, and it could make a big move on its technology platform in the future.
The Microsoft Office India team, located in Hyderabad (IDC), is building a set of next-generation experiences.
• Are you fascinated by building highly scalable APIs on a reliable stack that can fall back from persistent connections to SMS?
• Can you build and run services infrastructure that can scale to billions of transactions per day?
• Can you build UI infrastructure that can be extended in infinite ways?
We are part of the group whose mission is to reimagine productivity applications on mobile devices for emerging markets. A solid engineering culture, a fun set of people and tough problems to solve are part of the deal, and you will find it hard to say no. If you have the technical chops, we would love to hear from you.