
About Darwin Labs
Darwin Labs (www.darwinlabs.io)
We are Entrepreneurs and Darwinians who enjoy creating businesses.
We are building products on Bitcoin, Blockchain, Cryptocurrencies, Virtual Reality, Augmented Reality and Ad Tech.
We are the technology partner for GBMiners (www.gbminers.com), the second-biggest Bitcoin mining pool outside China, currently operating at 5% of the network's hashing power. The pool launched on 31st August and has since been the fastest-growing Bitcoin mining pool globally.
We run South East Asia's first blockchain incubator, Satoshi Studios (www.satoshistudios.io). The incubator offers 50K USD in funding to blockchain startups from South East Asia in return for 8-15% equity.
The incubator is backed by investors such as Roger Ver and will launch its first batch in the second quarter of this year.
About Nirmitee.io
Nirmitee.io is a fast-growing product engineering and IT services company building world-class products across healthcare and fintech. We believe in engineering excellence, innovation, and long-term impact. As a Tech Lead, you’ll be at the core of this journey — driving execution, building scalable systems, and mentoring a strong team.
What You’ll Do
- Lead by Example:
Be a hands-on contributor — writing clean, scalable, and production-grade code.
Review pull requests, set coding standards, and push for technical excellence.
- Own Delivery:
Take end-to-end ownership of sprints, architecture, code quality, and deployment.
Collaborate with PMs and founders to scope, plan, and execute product features.
- Build & Scale Teams:
Mentor and coach engineers to grow technically and professionally.
Foster a culture of accountability, transparency, and continuous improvement.
- Drive Process & Best Practices:
Implement strong CI/CD pipelines, testing strategies, and release processes.
Ensure predictable and high-quality delivery across multiple projects.
- Architect & Innovate:
Make key technical decisions, evaluate new technologies, and design system architecture for scale and performance.
Help shape the engineering roadmap with future-proof solutions.
What We’re Looking For
- 10+ years of experience in software development with at least 2 years in a lead/mentorship role.
- Strong hands-on expertise in MERN Stack or Python/Django/Flask, or GoLang/Java/Rust.
- Experience with AWS / Azure / GCP, containerization (Docker/Kubernetes), and CI/CD pipelines.
- Deep understanding of system design, architecture patterns, and performance optimization.
- Proven track record of shipping products in fast-paced environments.
- Strong communication, leadership, and ownership mindset.
- Belief in delivering what is committed and a show-up attitude.
Nice to Have
- Exposure to healthcare or fintech domains.
- Experience in building scalable SaaS platforms or open-source contributions.
- Knowledge of security, compliance, and observability best practices.
Why Nirmitee.io
- Work directly with the founding team and shape product direction.
- Flat hierarchy — your ideas will matter.
- Ownership, speed, and impact.
- Opportunity to build something historic.
- Research industry-related topics (combining online sources, interviews and studies)
- Write clear marketing copy to promote our products/services
- Prepare well-structured drafts using Content Management Systems
- Proofread and edit blog posts before publication
- Submit work to editors for input and approval
- Coordinate with marketing and design teams to illustrate articles
- Conduct simple keyword research and use SEO guidelines to increase web traffic
- Promote content on social media
- Identify customers’ needs and gaps in our content and recommend new topics
- Ensure all-around consistency (style, fonts, images and tone)
- Update website content as needed
Job Description
Job Responsibilities
- Design and implement robust database solutions, including:
  - Security, backup and recovery
  - Performance, scalability, monitoring and tuning
  - Data management and capacity planning
  - Planning and implementing failover between database instances
- Create data architecture strategies for each subject area of the enterprise data model.
- Communicate plans, status and issues to higher management levels.
- Collaborate with the business, architects and other IT organizations to plan a data strategy, sharing important information related to database concerns and constraints.
- Produce all project data architecture deliverables.
- Create and maintain a corporate repository of all data architecture artifacts.
Skills Required:
- Understanding of data analysis, business principles, and operations
- Software architecture and design; network design and implementation
- Data visualization, data migration and data modelling
- Relational database management systems
- DBMS software, including SQL Server
- Database and cloud computing design, architectures and data lakes
- Information management and data processing on multiple platforms
- Agile methodologies and enterprise resource planning implementation
- Demonstrated database technical expertise, such as performance tuning, backup and recovery, and monitoring (see the sketch after this list)
- Excellent skills with advanced features such as database encryption, replication, partitioning, etc.
- Strong problem-solving, organizational and communication skills
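To make the backup-and-recovery expectation concrete, here is a minimal Python sketch that triggers a full SQL Server backup over ODBC. The server name, credentials, database name, and backup path are hypothetical placeholders, and this is only one of many ways such a task might be scripted.

```python
# Minimal sketch (assumptions: SQL Server reachable via ODBC Driver 17, a
# database named "SalesDB", and a writable backup path on the server).
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=db.example.internal;DATABASE=master;UID=backup_user;PWD=example",
    autocommit=True,  # BACKUP cannot run inside a user transaction
)
cursor = conn.cursor()

# Full database backup; the path and database name are illustrative only.
cursor.execute(
    "BACKUP DATABASE SalesDB TO DISK = N'D:\\backups\\SalesDB.bak' WITH INIT"
)
# Consume any informational result sets so the backup runs to completion.
while cursor.nextset():
    pass

conn.close()
```

In practice this kind of script would typically be scheduled and paired with a matching RESTORE drill, but the sketch is only meant to show the level of hands-on familiarity the role expects.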
Role Objective:
The Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure that the optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Roles & Responsibilities:
- Sound knowledge of Spark architecture, distributed computing, and Spark Streaming.
- Proficient in Spark, including RDD and DataFrame core functions, troubleshooting, and performance tuning (see the sketch after this list).
- SFDC data modelling experience would be given preference.
- Good understanding of object-oriented concepts and hands-on experience with Scala, with excellent programming logic and technique.
- Strong grasp of functional programming and OOP concepts in Scala.
- Good experience in SQL – should be able to write complex queries.
- Managing a team of Associates and Senior Associates and ensuring utilization is maintained across the project.
- Able to mentor new members for onboarding to the project.
- Understand client requirements and be able to design, develop from scratch, and deliver.
- AWS cloud experience would be preferable.
- Design, build and operationalize large scale enterprise data solutions and applications using one or more of AWS data and analytics services - DynamoDB, RedShift, Kinesis, Lambda, S3, etc. (preferred)
- Hands on experience utilizing AWS Management Tools (CloudWatch, CloudTrail) to proactively monitor large and complex deployments (preferred)
- Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on AWS (preferred)
- Leading client calls to flag any delays, blockers and escalations, and to collate all the requirements.
- Managing project timing, client expectations and meeting deadlines.
- Should have played project and team management roles.
- Facilitate meetings within the team on regular basis.
- Understand business requirement and analyze different approaches and plan deliverables and milestones for the project.
- Optimization, maintenance, and support of pipelines.
- Strong analytical and logical skills.
- Ability to comfortably tackle new challenges and learn.
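To illustrate the Spark DataFrame work listed above, here is a minimal PySpark sketch. The input path, column names, and aggregation are hypothetical examples, not details of any actual project.

```python
# Minimal PySpark sketch: load a CSV, apply DataFrame transformations, and cache
# the result before aggregating. Paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-aggregation-sketch")
    .getOrCreate()
)

# Read raw data into a DataFrame (schema inferred for brevity).
orders = spark.read.csv("s3://example-bucket/orders.csv", header=True, inferSchema=True)

# Typical DataFrame core functions: filter, derive a column, group, aggregate.
daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
    .cache()  # cached because the result would typically be reused downstream
)

daily_totals.orderBy("order_date").show(10)
spark.stop()
```

Performance tuning in this context usually means reasoning about when to cache, how data is partitioned, and avoiding unnecessary shuffles, which is why the sketch caches the aggregated result rather than the raw input.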
Exciting Opportunity for Aspiring Video Editors! Join Marktecons for Video Editor Internship for a 1-Month Experience! 🌟🎥
Are you passionate about video editing and ready to kickstart your career in the world of visual storytelling?
We have an incredible opportunity for you!
Marktecons is thrilled to announce the Video Editor Internship, offering hands-on experience in the art of video editing.
This 1-month internship is designed to provide you with valuable skills, mentorship, and an opportunity to showcase your creativity.
About the work
1. Reviewing and selecting footage: You'll need to go through all the raw video footage to select the best takes to use in the final video.
2. Cutting and editing: You'll need to edit the video, using tools like trimming, cropping, and transitions to create a smooth and engaging final product.
3. Adding effects: You may need to add visual effects or filters to the video to enhance its overall quality or create a particular mood.
4. Incorporating sound: You may need to add music or sound effects to the video, as well as adjust volume levels and sync the sound with the visuals.
5. Collaborating with the production team: You'll need to work closely with producers, directors, and other team members to ensure that the final video meets their expectations and vision.
6. Managing project timelines: You'll need to be able to manage your time effectively, ensuring that projects are completed on time and within budget.
7. Staying up-to-date with industry trends: You'll need to stay up-to-date with the latest trends and technologies in the field to continue producing high-quality content.
Skills:
Proficiency in video editing software such as Adobe Premiere Pro, Adobe After Effects, Adobe Photoshop, Adobe Illustrator
Excellent understanding of post-production workflows, including color grading, audio mixing, and visual effects.
Ability to take files from graphics software such as Adobe Photoshop and Adobe Illustrator and use them in the edit.
Strong attention to detail and the capacity to maintain consistency in visual style and adherence to brand guidelines.
Portfolio showcasing previous video editing work is highly desirable.
- Taking ownership of building specific components of CARPL
- Working with the product team to prioritize tasks within these components
- Working with customers to redefine and modify the platform based on user input (optional).
You are the ideal candidate for this role if you:
- Have provable proficiency in Go programming
- Have expert-level command of Python, Django, Flask, RDBMS, NoSQL, Git, testing, and distributed systems.
- Have experience working in cloud environments; agile development methodologies with Test-Driven Development (TDD) would be an advantage.
- Are familiar with Docker, Kubernetes
- It would be awesome if you are also familiar with DICOM, DCM4CHEE, HL7
- Are familiar with Jira, Asana, and Slack
- Knowledge of Go templating, common frameworks, and tools
- Believe that the future of healthcare lies in the power of AI and analytics
- Thrive in a chaotic, fast-moving, and ambiguous work environment
About the Company
Blue Sky Analytics is a Climate Tech startup that combines the power of AI & Satellite data to aid in the creation of a global environmental data stack. Our funders include Beenext and Rainmatter. Over the next 12 months, we aim to expand to 10 environmental data-sets spanning water, land, heat, and more!
We are looking for a data scientist to join our growing team. This position will require you to think and act on the geospatial architecture and data needs (specifically geospatial data) of the company. This position is strategic and will also require you to collaborate closely with data engineers, data scientists, software developers and even colleagues from other business functions. Come save the planet with us!
Your Role
Manage: It goes without saying that you will be handling large amounts of image and location datasets. You will develop dataframes and automated pipelines of data from multiple sources. You are expected to know how to visualize them and use machine learning algorithms to be able to make predictions. You will be working across teams to get the job done.
Analyze: You will curate and analyze vast amounts of geospatial datasets like satellite imagery, elevation data, meteorological datasets, openstreetmaps, demographic data, socio-econometric data and topography to extract useful insights about the events happening on our planet.
Develop: You will be required to develop processes and tools to monitor and analyze data and its accuracy. You will develop innovative algorithms which will be useful in tracking global environmental problems like depleting water levels, illegal tree logging, and even tracking of oil-spills.
Demonstrate: Familiarity with working in geospatial libraries such as GDAL/Rasterio for reading/writing data, and with QGIS for making visualizations (a brief sketch follows this section). This also extends to using advanced statistical techniques and applying concepts like regression and properties of distributions, and conducting other statistical tests.
Produce: With all the hard work being put into data creation and management, it has to be used! You will be able to produce maps showing (but not limited to) spatial distribution of various kinds of data, including emission statistics and pollution hotspots. In addition, you will produce reports that contain maps, visualizations and other resources developed over the course of managing these datasets.
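As a loose illustration of the raster handling described above, here is a minimal Python sketch using rasterio and NumPy. The file name, band choice, and hotspot threshold are hypothetical placeholders rather than details of the actual datasets.

```python
# Minimal sketch: read a single-band raster (e.g. an emissions or elevation grid),
# compute summary statistics, and flag "hotspot" cells above a threshold.
# The file name, band index, and threshold are illustrative only.
import numpy as np
import rasterio

with rasterio.open("example_emissions.tif") as src:
    band = src.read(1).astype("float64")  # first band as a NumPy array
    nodata = src.nodata

# Mask out nodata values before computing statistics.
valid = band if nodata is None else np.where(band == nodata, np.nan, band)

print("mean:", np.nanmean(valid))
print("max:", np.nanmax(valid))

# Simple hotspot mask: cells above the 95th percentile of valid values.
threshold = np.nanpercentile(valid, 95)
hotspots = valid > threshold
print("hotspot cells:", int(np.count_nonzero(hotspots)))
```

The nodata masking step matters in practice: satellite-derived rasters are rarely complete, and statistics computed without masking will be skewed by fill values.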
Requirements
These are must have skill-sets that we are looking for:
- Excellent coding skills in Python (including deep familiarity with NumPy, SciPy, pandas).
- Significant experience with git, GitHub, SQL, AWS (S3 and EC2).
- Have worked on GIS and are familiar with geospatial libraries such as GDAL and rasterio for reading/writing data, GIS software such as QGIS for visualisation and querying, and basic machine learning algorithms for making predictions.
- Demonstrable experience implementing efficient neural network models and deploying them in a production environment.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
- Capable of writing clear and lucid reports and demystifying data for the rest of us.
- Be curious and care about the planet!
- Minimum 2 years of demonstrable industry experience working with large and noisy datasets.
Benefits
- Work from anywhere: Work by the beach or from the mountains.
- Open source at heart: We are building a community that you can use, contribute to, and collaborate on.
- Own a slice of the pie: Possibility of becoming an owner by investing in ESOPs.
- Flexible timings: Fit your work around your lifestyle.
- Comprehensive health cover: Health cover for you and your dependents to keep you tension free.
- Work Machine of choice: Buy a device and own it after completing a year at BSA.
- Quarterly Retreats: Yes, there's work, but there's also all the non-work fun, aka the retreat!
- Yearly vacations: Take time off to rest and get ready for the next big assignment by using your paid leave.