
Your Impact
This team is accountable for evolving the platform architecture to meet the changing needs of different business lines globally through rapid software deployment. As stewards of critical order-execution and post-trade components, the team is held to a high standard of software quality. The team consists of self-guided, pragmatic individuals who are motivated to change the status quo in calculated ways.
As a member of the team, you will play an integral role on the trading floor. This is a dynamic, entrepreneurial team with a passion for technology and the markets, made up of individuals who thrive in a fast-paced, changing environment. The team takes a data-driven approach to decision making, and you should be willing to participate in the full product lifecycle: requirements gathering, design, implementation, testing, support, and monitoring trading performance for systems and strategies used by our clients.
RESPONSIBILITIES AND QUALIFICATIONS
Responsibilities
+ Design, build, and maintain low-latency, high-performance electronic trading platform components, with a focus on market data, exchange and client connectivity, and risk controls.
+ Continuously optimize for latency, scale, and resiliency.
+ Participate in system builds for various markets globally; bring curiosity about and interest in market microstructure details; and work closely with engineering, sales, and product teams globally to deliver projects successfully.
Basic Qualifications
+ Bachelor's or Master's degree in Computer Science or Engineering, or equivalent experience
+ 3+ years of professional experience developing deterministic, high-performance, low-latency systems in C++ (latencies measured in single-digit microseconds)
+ Prior experience with FIX and binary exchange connectivity and market data protocols preferred
+ Strong knowledge of object-oriented programming, data structures, algorithms, and design patterns
+ Experience with critical-path analysis, performance optimization, and hardware acceleration
+ Linux systems programming experience including memory management, concurrent programming infrastructure, and the networking stack
+ Experience developing distributed architecture systems and messaging protocols
+ Strong analytical and problem solving skills
+ Comfortable in a fast-paced environment, self-motivated, results driven and commercially focused
Preferred Qualifications
+ Software development in C++ in the context of high performance (low-latency, high-throughput) real-time computing.
+ Familiarity with the STL, C++11/14 language features, and Boost
+ Network programming (sockets, TCP/UDP/Multicast protocols)
+ Multi-threading, concurrent programming
+ Intimate knowledge of compilers and the flow of data at the hardware level (memory/caches, buses)
+ Some experience with FPGA or other hardware acceleration technologies
+ Experience processing large static datasets as well as high-volume ticking data sources
+ 3+ years' experience in the financial industry (good to have)
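The qualifications above mention FIX connectivity. For readers unfamiliar with the protocol, a FIX message is a flat sequence of tag=value fields separated by the SOH byte (0x01), ending with a CheckSum(10) field. The sketch below is illustrative only: real exchange connectivity layers are far more involved, and the heartbeat-style message here is made up.

```python
# Minimal FIX tag=value parsing sketch; not a production connectivity layer.
SOH = "\x01"  # FIX field delimiter

def parse_fix(raw: str) -> dict:
    """Split a raw FIX message into a {tag: value} dict and verify
    CheckSum(10): the sum of all bytes up to and including the SOH
    preceding tag 10, modulo 256, zero-padded to three digits."""
    fields = dict(f.split("=", 1) for f in raw.split(SOH) if f)
    body = raw[: raw.index(SOH + "10=") + 1]
    checksum = sum(body.encode()) % 256
    if f"{checksum:03d}" != fields["10"]:
        raise ValueError("checksum mismatch")
    return fields

# Hypothetical heartbeat-style message assembled field by field.
msg_body = SOH.join(["35=0", "49=SENDER", "56=TARGET", "34=1"]) + SOH
msg = f"8=FIX.4.2{SOH}9={len(msg_body)}{SOH}" + msg_body
msg += f"10={sum(msg.encode()) % 256:03d}{SOH}"

parsed = parse_fix(msg)
print(parsed["35"], parsed["49"])  # message type and sender: 0 SENDER
```

Note the checksum covers every byte before the `10=` field, including delimiters, which is why a single flipped byte anywhere in the message is caught.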

About Defi Tech Pvt Ltd
Similar jobs
Wissen Technology is now hiring a Java Developer with hands-on experience in Core Java, multithreading, algorithms, and data structures.
We are solving complex technical problems in the financial industry and need talented software engineers to join our mission and be a part of a global software development team.
A brilliant opportunity to become part of a highly motivated and expert team that has made its mark in high-end technical consulting.
Required Skills:
Experience: 3+ years.
Experience in Core Java 5.0 and above, CXF, Spring.
Extensive experience in developing enterprise-scale n-tier applications for the financial domain. Should possess good architectural knowledge and be aware of enterprise application design patterns.
Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
Good development experience with RDBMS, preferably Sybase database.
Good knowledge of multi-threading and high-volume server-side development.
Experience in sales and trading platforms in investment banking/capital markets.
Basic working knowledge of Unix/Linux.
Excellent problem solving and coding skills in Java.
Strong interpersonal, communication and analytical skills.
Should have the ability to express their design ideas and thoughts.


Software Developer
Roles and Responsibilities
- Apply your knowledge to fetch data from multiple online sources, cleanse it, and build APIs on top of it
- Develop a deep understanding of our vast data sources on the web and know exactly how, when, and which data to scrape, parse, and store
- We're looking for people who will naturally take ownership of data products and who can bring a project all the way from a fast prototype to production.
Desired Candidate Profile
- At least 1-2 years of experience
- Strong coding experience in Python (knowledge of JavaScript is a plus)
- Strong knowledge of scraping frameworks in Python (Requests, Beautiful Soup)
- Experience with SQL and NoSQL databases
- Knowledge of version control tools like Git.
- Good understanding of, and hands-on experience with, scheduling and managing tasks with cron.
Nice to have:
- Experience working with Elasticsearch
- Experience with multi-processing, multi-threading, and AWS/Azure is a plus
- Experience with web crawling is a plus
- Experience deploying server and related components to staging and live environments.
- Experience with cloud environments like AWS, as well as tools like Docker, Lambda, etc.
- Experience with DevOps and related practices to improve the development lifecycle and deliver continuously with high quality is an advantage.
- Compile and analyze data, processes, and code to troubleshoot problems and identify areas for improvement
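The responsibilities above boil down to fetch, parse, and store. A toy end-to-end sketch of the parsing step is below; to stay self-contained and runnable offline it uses only the standard library's html.parser rather than Requests and Beautiful Soup, and the HTML snippet and its CSS classes are invented for illustration (in a real pipeline the markup would come from an HTTP fetch).

```python
from html.parser import HTMLParser

# Invented snippet standing in for a fetched page, so the sketch runs offline.
HTML = """
<ul>
  <li class="item"><span class="name">Widget</span><span class="price">9.99</span></li>
  <li class="item"><span class="name">Gadget</span><span class="price">24.50</span></li>
</ul>
"""

class ItemParser(HTMLParser):
    """Collect {"name": ..., "price": ...} dicts from the spans above."""
    def __init__(self):
        super().__init__()
        self.field = None   # which span class we are currently inside, if any
        self.items = []     # accumulated records

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("name", "price"):
            self.field = cls
            if cls == "name":          # a "name" span starts a new record
                self.items.append({})

    def handle_data(self, data):
        if self.field and data.strip():
            value = data.strip()
            # Cleanse while storing: prices become floats, names stay strings.
            self.items[-1][self.field] = float(value) if self.field == "price" else value
            self.field = None

parser = ItemParser()
parser.feed(HTML)
print(parser.items)
```

The same extracted records would then be written to SQL/NoSQL storage or served through an API, per the role description.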
Experience: 2 to 8 Years
Job Description
- Technical skills required: Java, multithreading, OOPS, data structures, Karaf.
- Total experience required should be around 2 to 8 years.
Job Description :
- Strong development skills in Java JDK 1.7 or above.
- Knowledge of Java 8 features and Multithreading is a must-have.
- Should have strong acumen in data structures, algorithms, problem-solving, and logical/analytical skills.
- Thorough understanding of OOPS concepts, design principles, and the implementation of different types of design patterns.
- Sound understanding of concepts like exception handling, serialization/deserialization, immutability, etc.
- Experience with multithreading, the concurrent package, and concurrent APIs. Basic understanding of the Java Memory Model (JMM), including garbage collection concepts.
- Experience with RDBMS or NoSQL databases and writing SQL queries (joins, GROUP BY, aggregate functions, etc.)
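The last bullet names joins, GROUP BY, and aggregate functions. A quick, self-contained way to exercise all three is sketched below, using Python's built-in sqlite3 purely as a stand-in for an enterprise RDBMS such as Sybase, with an invented two-table schema.

```python
import sqlite3

# In-memory toy schema: orders joined to customers, aggregated per customer.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

# Join + GROUP BY + aggregates: order count and total value per customer.
rows = con.execute("""
    SELECT c.name, COUNT(o.id) AS n_orders, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY total DESC
""").fetchall()

for name, n, total in rows:
    print(f"{name}: {n} orders, {total:.2f}")
```

The same query shape (inner join, grouping key, COUNT/SUM) carries over unchanged to most SQL dialects.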


- Expertise in software design and development.
- Proficiency with at least one object-oriented language (e.g., Java)
- Experience with building high-performance, highly available and scalable distributed systems
- Experience with API Design, ability to architect and implement an intuitive customer and third-party integration story.




Be Part Of Building The Future
Dremio is the Data Lake Engine company. Our mission is to reshape the world of analytics to deliver on the promise of data with a fundamentally new architecture, purpose-built for the exploding trend towards cloud data lake storage such as AWS S3 and Microsoft ADLS. We dramatically reduce and even eliminate the need for the complex and expensive workarounds that have been in use for decades, such as data warehouses (whether on-premise or cloud-native), structural data prep, ETL, cubes, and extracts. We do this by enabling lightning-fast queries directly against data lake storage, combined with full self-service for data users and full governance and control for IT. The results for enterprises are extremely compelling: 100X faster time to insight; 10X greater efficiency; zero data copies; and game-changing simplicity. And equally compelling is the market opportunity for Dremio, as we are well on our way to disrupting a $25BN+ market.
About the Role
The Dremio India team owns the DataLake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for innovative minds with experience in leading and building high-quality distributed systems at massive scale and solving complex problems.
Responsibilities & ownership
- Lead, build, deliver and ensure customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product.
- Work on distributed systems for data processing with efficient protocols and communication, locking and consensus, schedulers, resource management, low-latency access to distributed storage, auto-scaling, and self-healing.
- Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
- Lead the team to solve complex and unknown problems
- Solve technical problems and customer issues with technical expertise
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Mentor other team members for high quality and design
- Collaborate with Product Management to deliver on customer requirements and innovation
- Collaborate with Support and field teams to ensure that customers are successful with Dremio
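The concurrency and parallelization responsibilities above describe the fan-out/merge pattern that query engines rely on: compute partial results per partition in parallel, then combine them. A deliberately simplified, single-process sketch follows, with Python threads standing in for distributed executors; this is not Dremio's implementation, just the shape of the idea.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy scatter/gather: a partial aggregate per partition, merged at the end.
partitions = [
    [3, 1, 4],
    [1, 5, 9],
    [2, 6, 5],
]

def partial_agg(rows):
    """Per-partition partial aggregate: (count, sum).
    Partials compose, so the merge step never needs the raw rows."""
    return len(rows), sum(rows)

# Fan out: one task per partition, executed concurrently.
with ThreadPoolExecutor(max_workers=3) as pool:
    partials = list(pool.map(partial_agg, partitions))

# Merge: combine partials into global count, sum, and mean.
count = sum(c for c, _ in partials)
total = sum(s for _, s in partials)
print(count, total, total / count)  # 9 36 4.0
```

The key design point is that the partial aggregate is associative, so partitions can be processed in any order, on any worker, and merged without coordination.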
Requirements
- B.S./M.S./equivalent in Computer Science or a related technical field, or equivalent experience
- Fluency in Java/C++ with 8+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models, and their use in developing distributed and scalable systems
- 5+ years of experience developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using the latest technologies
- Ability to solve ambiguous, unexplored, and cross-team problems effectively
- Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and Google Cloud)
- Understanding of distributed file systems such as S3, ADLS, or HDFS
- Excellent communication skills and affinity for collaboration and teamwork
- Ability to work individually and collaboratively with other team members
- Ability to scope and plan solutions for big problems, and to mentor others in doing the same
- Interested and motivated to be part of a fast-moving startup with a fun and accomplished team





