About QuantX Technologies Pvt. Ltd.
Responsibilities:
- Design and develop large-scale sub-systems
- Periodically explore the latest technologies (especially open source) and prototype sub-systems
- Be a part of the team that develops the next-gen Targeting platform
- Build components to make the customer data platform more efficient and scalable
Qualifications:
- 0-2 years of relevant experience with Java, algorithms, data structures, and optimization, in addition to coding.
- Education: B.E./B.Tech/M.Tech/M.S. in Computer Science or IT from a premier institute
Skill Set:
- Good aptitude and analytical skills (emphasis will be on algorithms, data structures, and optimization in addition to coding)
- Good knowledge of databases (SQL and NoSQL)
- Knowledge of unit testing is a plus
Soft Skills:
- Has an appreciation of technology and its ability to create value in the marketing domain
- Excellent written and verbal communication skills
- Active & contributing team member
- Strong work ethic with demonstrated ability to meet and exceed commitments
- Other: Experience working in a start-up is a plus
A short description of the company:
It is an account engagement platform that helps B2B organizations achieve predictable revenue growth by putting the power of AI, big data, and machine learning behind every member of the revenue team.
Looking for a Python Developer.
Your toolbox:
Experience: 4+ years. Strong C/C++/C#/.NET Core development skills with a good understanding of object-oriented and multi-threaded design.
Strong background in computer science fundamentals (data structures, algorithms)
Passionate about learning and exploring new technologies; demonstrates good analysis and problem-solving skills.
Good written and verbal communication skills; should be a quick learner and a team player.
B.E./B.Tech (CS/IT) or MCA/M.E./M.Tech (CS/IT)
Big plus [ mastery of one or more of the below ]:
Network troubleshooting skills [ TCP/IP, SSH, HTTPS ]
Hands-on experience with Kubernetes and cloud environments
Hands-on experience with UNIX or Linux operating systems
Strong with VoIP technologies [ SIP and RTP ]
Good understanding of SOA architecture
As a Golang Developer, you will be working on a Layer 1 blockchain.
● Advanced proficiency in the Go programming language; skills in languages such as C++, Java, Solidity, and Python are good to have.
● Extensive experience in back-end development, algorithms, and data structures.
● Extensive knowledge of blockchain structure, protocol development, or smart contracts
● Writing clean, efficient, and reusable code that follows best practices and coding standards.
● Knowledge of distributed and decentralized network protocols
● Knowledge of various decentralized ledger technologies and protocols
● Understanding of gossip protocol and consensus protocol
● Knowledge of best practices in data protection.
● Collaborating with managers to determine technology needs and envisaged functionalities.
● Creating application features and interfaces by using programming languages and writing multithreaded codes.
● Applying the latest cryptology techniques to protect digital transaction data against cyberattacks and information hacks.
● Maintaining client and server-side applications.
Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses.
Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen, streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.
As Conviva is expanding, we are building products providing deep insights into end user experience for our customers.
Platform and TLB Team
The vision for the TLB team is to build data-processing software that works on terabytes of streaming data in real time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets, covering both the Spark-like backend infrastructure and a library-based programming model. Build horizontally and vertically scalable systems that analyze trillions of events per day within sub-second latencies. Utilize the latest big data technologies to build solutions for use cases across multiple verticals. Lead technology innovation and advancement that will have big business impact for years to come. Be part of a worldwide team building software using the latest technologies and the best software development tools and processes.
What You’ll Do
This is an individual contributor position. Expectations will be on the below lines:
- Design, build and maintain the stream processing, and time-series analysis system which is at the heart of Conviva's products
- Responsible for the architecture of the Conviva platform
- Build features, enhancements, new services, and bug fixing in Scala and Java on a Jenkins-based pipeline to be deployed as Docker containers on Kubernetes
- Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements etc.
- Lead a team to develop a feature or parts of the product
- Adhere to the Agile model of software development to plan, estimate, and ship per business priority
What you need to succeed
- 9+ years of work experience in software development of data processing products.
- Engineering degree in software or equivalent from a premier institute.
- Excellent knowledge of Computer Science fundamentals such as algorithms and data structures. Hands-on experience with functional programming and its concepts
- Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
- Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
- Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
- Experience/knowledge of Akka/Lagom framework and/or stream processing technologies like RxJava or Project Reactor will be a big plus. Knowledge of design patterns like event-streaming, CQRS and DDD to build large microservice architectures will be a big plus
- Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity
Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its-kind innovations like time-state analytics and AI-automated data modeling, that surface actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their businesses ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision, and Warner Bros Discovery.
Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets!
Hiring Java Developers at all hierarchy levels for Datametica Solutions Pvt. Ltd.
Designation: Developer / Lead / Architect - JAVA
Experience - 4+ Years
Work Location - Pune
Responsibilities:
- Own, drive and evolve product systems/subsystems
- Develop and architect highly scalable, highly available, reliable, secure, and fault-tolerant systems with minimal guidance
- Suggest new architectural elements to improve the existing architecture
- Design and implement low latency RESTful services; Define API contracts between services; Version APIs and make them backward compatible
- Translate business requirements into scalable and extensible design
- Create platforms, reusable libraries and utilities wherever applicable
- Continuously refactor applications to ensure high-quality design
- Choose the right technology stack for the product systems/subsystems
- Write high-quality code that is modular, functional and testable; establish the best coding practices
- Formally mentor junior engineers on design, coding and troubleshooting
- Plan projects using agile methodologies and ensure timely delivery
- Work with automation engineers to automate end-to-end flows and non-functional requirements
- Troubleshoot issues effectively in a distributed architecture
- Communicate, collaborate and work effectively in a global environment
- Operationalize releases by partnering with Tech operations on capacity planning and operability of the product.
Skills Required:
- Proficient in a JVM-based language (e.g. Java, Groovy) and the J2EE technology stack
- Expertise in API design and development
- Experience in dealing with large datasets
- Strong in data structures, collections, algorithms, multithreading, etc.
- Practices coding standards (clean code, design patterns, etc.)
- Very strong object-oriented design skills; awareness of design patterns and architectural patterns
- Performance tuning and troubleshooting of memory issues, GC tuning, resource leaks, etc.
- Strong problem-solving skills, algorithmic skills and data structures
- Experience in agile methodologies like Scrum
- Good understanding of branching, build, deployment, continuous integration methodologies
- Experience in leading a team (min. 5) and mentoring engineers
- An attitude of getting stuff done!
- Ability to make decisions independently.
Interested candidates should send their resumes immediately.
Note: Candidates who are immediately available or have a notice period of 30 to 45 Days are highly preferred.
The Solar Labs was founded by IIT alumni in 2017 to accelerate solar adoption in the world. Our products empower the solar industry to help it succeed. We develop software that helps solar installers and developers design more optimized solar PV systems, increase energy yield per panel installed, reduce installation costs, and create quotations and reports for clients within 20 minutes. The software has been used to estimate 1200 MW+ of solar capacity across India and serves some of the largest companies in the world, including Tata Power, Adani Solar, Renew Power, and hundreds of MSMEs.
When we succeed, the solar industry wins, and the world wins.
About the Product :
It is 3D simulation software used to replicate rooftops and commercial sites, place solar panels, and estimate the solar energy generated.
Responsibilities :
- Develop cloud-based Python Django software products
- Working closely with UX and Front-end Developers
- Participating in architectural, design, and product discussions
- Designing and creating RESTful APIs for internal and partner consumption
- Working in an agile environment with an excellent team of engineers
Skills :
- Solid database skills in a relational database (e.g. PostgreSQL, MySQL)
- Knowledge of how to build and consume RESTful APIs
- Strong knowledge of version control (e.g. Git, SVN)
- Experience deploying Python applications into production
- Amazon Web Services (AWS) or Google cloud infrastructure knowledge is a plus
Job Location : Pune
Experience: 4 to 6 years
Responsibility
Design and development of Linux device drivers, BSPs, and kernel modules
1. Strong in systems C programming on the Linux platform.
2. Strong experience in Linux Kernel and device driver development
3. Experience in Kernel porting and migration to different platforms and kernel versions.
4. Experience in BSP and Boot loaders
5. Device driver experience, preferably for the following devices: UART, I2C, I2S, SPI, GPIO, PCIe, MMC, USB, etc.
6. Good knowledge of platforms and peripheral devices
7. Knowledge of processors such as the i.MX
Good to have:
8. PowerPC experience
9. Python programming
Package: Commensurate with relevant experience (CTC range: 6 to 10 lakhs/annum)
Immediate joining!