
We're Hiring: Crypto Trader – Trade Fast, Think Smart, Scale Big
Location: Permanent Remote
Type: Full-Time
Experience: 1–3 Years (or a proven self-taught pro)
About the Role
We're building a proprietary crypto trading desk for sharp, disciplined, and driven traders. If you live for the markets, thrive in volatility, and can turn chaos into opportunity, this role gives you the chance to trade serious capital with the right tools and mentorship.
What You'll Do
- Live Crypto Trading: Trade BTC, ETH & altcoins on spot and derivatives (Binance, Bybit & more).
- Market Analysis: Study price action, volume, funding rates, liquidation clusters & macro catalysts.
- Tools & Platforms: Use TradingView, TensorCharts, Coinglass & more for real-time insights.
- Strategy Execution: Run scalps, breakouts, trend-following, mean reversion & news-based trades.
- Collaboration: Work with quant & trading teams to refine or automate setups.
- Risk Management: Follow stop-loss, leverage discipline & drawdown limits; strategy over emotion.
- Journaling & Reviews: Maintain detailed trade logs & engage in weekly P&L review sessions.
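The risk-management discipline listed above can be made concrete with a textbook fixed-fractional position-sizing calculation. This is a generic illustration, not this desk's actual sizing rule; all figures are invented for the example:

```java
public class PositionSize {

    // Fixed-fractional sizing: risk a set fraction of account equity per trade;
    // the distance to the stop-loss determines how many units to take.
    static double positionSize(double equity, double riskFraction,
                               double entry, double stopLoss) {
        double riskPerUnit = Math.abs(entry - stopLoss); // loss per unit if stopped out
        return (equity * riskFraction) / riskPerUnit;    // units to buy/sell
    }

    public static void main(String[] args) {
        // Risk 1% of a $50,000 account on a long from $40,000 with a stop at $39,000.
        double units = positionSize(50_000, 0.01, 40_000, 39_000);
        System.out.println(units); // prints 0.5
    }
}
```

Sizing from the stop distance (rather than a fixed notional) is what keeps every losing trade capped at the same fraction of equity, which is the point of the drawdown limits the posting mentions.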
You're a Fit If You…
- Already trade crypto (personal account or at a desk).
- Understand liquidity zones, market structure & volatility.
- Thrive in 24/7, fast-paced trading environments.
- Are flexible with market hours & global schedules.
- Love exploring new trading tools, data, and platforms.
What We Offer
- Pro Setup: Institutional-grade terminals, real-time tools & blazing-fast connectivity.
- Unlimited Payouts: Performance bonuses with no cap; you win, you earn.
- Mentorship: Learn directly from senior crypto & derivatives traders.
- Growth Culture: Weekly strategy sessions, alpha sharing & market explorations.
- Flat Team Vibes: No ego, no politics; just market focus.

Similar jobs
We are looking for a Java Backend Developer to join our team. You will be responsible for developing and maintaining the server-side logic and databases of our applications, ensuring they run smoothly and efficiently.
Responsibilities:
- Write and maintain server-side code using Java.
- Develop and integrate APIs to support frontend functionality.
- Work with databases to store, retrieve, and manipulate data.
- Troubleshoot, debug, and optimize backend performance.
- Collaborate with frontend developers to integrate backend and frontend systems.
- Write unit tests to ensure code quality.
Required Skills:
- Strong experience with Java.
- Familiarity with Spring Boot or other Java frameworks.
- Knowledge of relational databases (e.g., MySQL, PostgreSQL).
- Understanding of REST APIs.
- Experience with version control systems (e.g., Git).
- Basic knowledge of unit testing (e.g., JUnit).
Preferred Skills:
- Familiarity with Microservices and cloud platforms.
- Experience with Docker and containerization.
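As a purely illustrative sketch of the server-side work this role describes, here is a minimal REST-style endpoint. It uses only the JDK's built-in `com.sun.net.httpserver` (not Spring Boot, which the posting actually names) so that the example stays self-contained; the `/health` path and JSON payload are made up:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class HealthEndpoint {

    // JSON body served by the /health endpoint.
    static String healthJson() {
        return "{\"status\":\"UP\"}";
    }

    public static void main(String[] args) throws Exception {
        // Port 0 lets the OS pick a free port.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/health", exchange -> {
            byte[] body = healthJson().getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();

        // Exercise the endpoint once with the JDK's own HTTP client.
        String url = "http://localhost:" + server.getAddress().getPort() + "/health";
        HttpResponse<String> resp = HttpClient.newHttpClient()
                .send(HttpRequest.newBuilder(URI.create(url)).build(),
                      HttpResponse.BodyHandlers.ofString());
        System.out.println(resp.statusCode() + " " + resp.body());

        server.stop(0);
    }
}
```

In a Spring Boot codebase the handler above would instead be a `@RestController` method, with the framework handling routing and serialization.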
Job Description
- Manual testing.
- Should be familiar with the Java programming language and have good problem-solving skills in it.
- BDD Cucumber framework.
- Should be familiar with the automation testing tool Selenium.
- Should be familiar with API testing, both manual and automated (widely used tools: Postman and the Rest Assured library).
- Good communication skills.
Roles and Responsibilities
- Develop and execute automation scripts for newly developed features. Modify existing framework to augment feature testing needs.
- Continuously monitor automation test results and ensure no random test failures in automation.
- Identify the cross-browser / platform impacts during the planning phase and ensure high test coverage.
- Identify quality assurance process gaps and suggest actions for improvement.
- Ensure zero defect leakage.
- Contribute to non-functional testing aspects: performance, security, and translation.
- Work closely with the development team and ensure code is delivered with high code coverage.
- Contribute to continuous integration and continuous deployment.
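The assert-status/assert-body pattern behind the API testing this posting describes can be sketched in plain Java. The posting names Rest Assured and Postman; this stdlib-only sketch just mirrors the same checks against a hypothetical `/users/1` stub so it runs without external dependencies:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class ApiCheck {

    // The checks an API test typically makes: status code and body content.
    static boolean verify(int status, String body) {
        return status == 200 && body.contains("\"id\"");
    }

    public static void main(String[] args) throws Exception {
        // Local stub standing in for the service under test.
        HttpServer stub = HttpServer.create(new InetSocketAddress(0), 0);
        stub.createContext("/users/1", ex -> {
            byte[] b = "{\"id\":1,\"name\":\"demo\"}".getBytes(StandardCharsets.UTF_8);
            ex.sendResponseHeaders(200, b.length);
            try (OutputStream os = ex.getResponseBody()) { os.write(b); }
        });
        stub.start();

        String url = "http://localhost:" + stub.getAddress().getPort() + "/users/1";
        HttpResponse<String> resp = HttpClient.newHttpClient()
                .send(HttpRequest.newBuilder(URI.create(url)).GET().build(),
                      HttpResponse.BodyHandlers.ofString());
        stub.stop(0);

        System.out.println(verify(resp.statusCode(), resp.body()) ? "PASS" : "FAIL");
    }
}
```

With Rest Assured the same check collapses to a fluent `given().get(url).then().statusCode(200)` chain; the underlying assertions are identical.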
Role: Azure Fabric Data Engineer
Experience: 5–10 Years
Location: Pune/Bangalore
Employment Type: Full-Time
About the Role
We are looking for an experienced Azure Data Engineer with strong expertise in Microsoft Fabric and Power BI to build scalable data pipelines, Lakehouse architectures, and enterprise analytics solutions on the Azure cloud.
Key Responsibilities
- Design & build data pipelines using Microsoft Fabric (Pipelines, Dataflows Gen2, Notebooks).
- Develop and optimize Lakehouse / Data Lake / Delta Lake architectures.
- Build ETL/ELT workflows using Fabric, Azure Data Factory, or Synapse.
- Create and optimize Power BI datasets, data models, and DAX calculations.
- Implement semantic models, incremental refresh, and Direct Lake/DirectQuery.
- Work with Azure services: ADLS Gen2, Azure SQL, Synapse, Event Hub, Functions, Databricks.
- Build dimensional models (Star/Snowflake) and support BI teams.
- Ensure data governance & security using Purview, RBAC, and AAD.
Required Skills
- Strong hands-on experience with Microsoft Fabric (Lakehouse, Pipelines, Dataflows, Notebooks).
- Expertise in Power BI (DAX, modeling, Dataflows, optimized datasets).
- Deep knowledge of Azure Data Engineering stack (ADF, ADLS, Synapse, SQL).
- Strong SQL, Python/PySpark skills.
- Experience in Delta Lake, Medallion architecture, and data quality frameworks.
Nice to Have
- Azure Certifications (DP-203, PL-300, Fabric Analytics Engineer).
- Experience with CI/CD (Azure DevOps/GitHub).
- Databricks experience (preferred).
Note: One technical round must be conducted face-to-face at either the Pune or Bangalore office.
Looking for immediate joiners.
Experience in JD Edwards sales and distribution, advanced pricing, P2P, and warehouse management.
- Conducting skills gap analyses between industry expectations and academic curriculum.
- Taking inputs from corporates and designing learning solutions for students pursuing the courses from the brand to achieve learning goals.
- Developing learning objectives and ensuring content matches those objectives
- Creating engaging high level and low-level instructional design for students to best grasp content.
- Creating relevant graphic and online content in collaboration with the graphics and the technology team.
- Coordinating with business development teams on solutions for corporates.
- Coordinating with academic heads and faculties as required to best implement standardised curriculum.
- Conducting Train the Trainer for the faculties to ensure specific learning outcomes through each course.
- Designing assessment methodologies to make these outcomes measurable.
- Maintaining content coordination across all campuses to ensure standardization.
- Ensuring the brand is protected by adhering to brand guidelines and company regulations.
What you need to have:
- 5+ years of relevant work experience.
- Passionate about education, proactive, and willing to take ownership.
- Can thrive in a high-pressure work environment.
- Ability to identify trends and react quickly to keep the curriculum and content relevant.
- Excellent communication and organizational skills
- Storyboarding, eLearning, Instructional design & content writing
- Positive attitude and a desire to produce measurable results
- Proven hands-on Software Development experience
- Proven working experience in Java development
- Hands on experience in designing and developing applications using Java EE platforms
- Object Oriented analysis and design using common design patterns.
- Deep insight into Java and JEE internals (class loading, memory management, transaction management, etc.)
- Excellent knowledge of Relational Databases, SQL and ORM technologies (JPA2, Hibernate)
- Experience in the Spring Framework
- Experience as a Sun Certified Java Developer
- Experience in developing web applications using at least one popular web framework (JSF, Wicket, GWT, Spring MVC)
- Experience with test-driven development
The MATLAB developer on the application team is responsible for developing the backend interfaces and business logic of machine applications customized for different product lines.
Responsibilities
- Core MATLAB large-application development
- Graphical application development in MATLAB, including object-oriented programming
- M-scripting experience with GUI design is a must
- Ability to independently architect and deliver new, large-scale applications in MATLAB
- MATLAB M-scripting & troubleshooting
- HPC (high-performance computing), parallel processing
- JIRA, Git/GitHub
- Ability to communicate and work collaboratively
Qualifications
- Bachelor's degree in Computer Science or relevant field
- 5+ years of experience working with MATLAB or equivalent experience
- Good knowledge of development and continuous-integration processes and tools (Studio, Git, Jira, Confluence, Jenkins, TeamCity, etc.)
- Good knowledge in agile software development (Scrum, Kanban)

- Establish and maintain a trusted advisor relationship within the company's IT, Commercial Digital Solutions, Functions, and Businesses you interact with
- Establish and maintain close working relationships with teams responsible for delivering solutions to the companyās businesses and functions
- Perform key management and thought leadership in the areas of advanced data techniques, including data modeling, data access, data integration, data visualization, big data solutions, text mining, data discovery, statistical methods, and database design
- Work with business partners to define ways to leverage data to develop platforms and solutions to drive business growth
- Engage collaboratively with project teams to support project objectives through the application of sound data architectural principles; support the project with knowledge of existing data assets and provide guidance on reusable data structures
- Share knowledge of external and internal data capabilities and trends, provide leadership, and facilitate the evaluation of vendors and products
- Utilize advanced data analysis, including statistical analysis and data mining techniques
- Collaborate with others to set an enterprise data vision with solid recommendations, and work to gain business and IT consensus
Basic Qualifications
- Overall 15+ years of IT environment experience.
- Solid background as a data architect/cloud architect, with a minimum of 5 years as a core architect
- Architecting experience with cloud data warehouse solutions (Snowflake preferred; BigQuery/Redshift/Synapse Analytics nice to have)
- Strong architecture and design skills using Azure services (e.g., ADF/Data Flows/Event Grid/IoT Hub/Event Hub/ADLS Gen2/serverless Azure Functions/Logic Apps/Azure Analysis Services cube design patterns/Azure SQL DB)
- Working knowledge of Lambda/Kappa frameworks within data lake designs & architecture solutions
- Deep understanding of the DevOps/DataOps patterns
- Architecting the semantic models
- Data modeling experience with Data vault principles
- Cloud-native Batch & Realtime ELT/ETL patterns
- Familiarity with Lucid Chart/Visio
- Logging & monitoring pattern designs in the data lake /data warehouse context
- Metadata-driven design patterns & solutioning expertise
- Data Catalog integration experience in the data lake designs
- 3+ years of experience partnering with business managers to develop technical strategies and architectures to support their objectives
- 2+ years of hands-on experience with analytics deployment in the cloud (prefer Azure but AWS knowledge is acceptable)
- 5+ years of delivering analytics in modern data architecture (Hadoop, Massively Parallel Processing Database Platforms, and Semantic Modeling)
- Demonstrable knowledge of ETL and ELT patterns and when to use either one; experience selecting among different tools that could be leveraged to accomplish this (Talend, Informatica, Azure Data Factory, SSIS, SAP Data Services)
- Demonstrable knowledge of and experience with different scripting languages (Python, JavaScript, Pig, or object-oriented programming like Java or .NET)
Preferred Qualifications
- Bachelor's degree in Computer Science, MIS, a related field, or equivalent experience
- Experience working with solutions delivery teams using Agile/Scrum or similar methodologies
- 2+ years of experience designing solutions leveraging Microsoft Cortana Intelligence Suite of Products [Azure SQL, Azure SQL DW, Cosmos DB, HDInsight, DataBricks]
- Experience with enterprise systems, like CRM, ERP, Field Services Management, Supply Chain solutions, HR systems
- Ability to work independently, establishing strategic objectives, project plans, and milestones
- Exceptional written, verbal & presentation skills







