
- Ensure that all activities are carried out in a timely and cost-effective manner.
- Apply management procedures, systems, and best practices.
- Purchase materials, plan inventory, and monitor warehouse efficiency.
- Ensure the organization's processes remain legally compliant.
- Develop strategic and operational goals.
- Examine financial data and use it to improve profitability.
- Manage budgets and forecasts.
- Perform quality checks and track production KPIs.
- Train and oversee employees.
- Look for ways to improve the quality of customer service.

About Madman Technologies

We are looking for a skilled Data Engineer to design, build, and maintain robust data pipelines and infrastructure. You will play a pivotal role in optimizing data flow, ensuring scalability, and enabling seamless access to structured and unstructured data across the organization. This role requires technical expertise in Python, SQL, ETL/ELT frameworks, and cloud data warehouses, along with strong collaboration skills to partner with cross-functional teams.
Company: BigThinkCode Technologies
URL:
Location: Chennai (Work from office / Hybrid)
Experience: 4 - 6 years
Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines to process structured and unstructured data.
- Optimize and manage SQL queries for performance and efficiency in large-scale datasets.
- Work with data warehouse solutions (e.g., Redshift, BigQuery, Snowflake) for analytics and reporting.
- Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions.
- Implement solutions for streaming data (e.g., Apache Kafka, AWS Kinesis); preferred but not mandatory.
- Ensure data quality, governance, and security across pipelines and storage systems.
- Document architectures, processes, and workflows for clarity and reproducibility.
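The pipeline responsibilities above can be sketched, in miniature, as a plain extract/transform/load flow. This is an illustrative, stdlib-only sketch (the inline feed, the `payments` table, and the column names are invented), not a description of this company's stack; a production pipeline would pull from real sources, run under an orchestrator such as Airflow, and target a warehouse like Redshift or Snowflake:

```python
import json
import sqlite3

# Hypothetical raw feed: in a real pipeline this would come from an API,
# object store, or message queue rather than an inline string.
RAW_EVENTS = """
[{"user": "a", "amount": "10.5"},
 {"user": "b", "amount": "4.0"},
 {"user": "a", "amount": "2.5"}]
"""

def extract(raw: str) -> list:
    """Parse semi-structured JSON into Python records."""
    return json.loads(raw)

def transform(records: list) -> list:
    """Cast types and reshape records into rows for the target table."""
    return [(r["user"], float(r["amount"])) for r in records]

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Idempotently (re)create the target table and bulk-insert rows."""
    conn.execute("DROP TABLE IF EXISTS payments")
    conn.execute("CREATE TABLE payments (user TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_EVENTS)), conn)
total_by_user = dict(conn.execute(
    "SELECT user, SUM(amount) FROM payments GROUP BY user"))
print(total_by_user)  # {'a': 13.0, 'b': 4.0}
```

Keeping extract, transform, and load as separate functions is what makes each stage independently testable and swappable, which is the property orchestrators like Airflow build on.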
Required Technical Skills:
- Proficiency in Python for scripting, automation, and pipeline development.
- Expertise in SQL (complex queries, optimization, and database design).
- Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, dbt, AWS Glue).
- Experience working with structured data (RDBMS) and unstructured data (JSON, Parquet, Avro).
- Familiarity with cloud-based data warehouses (Redshift, BigQuery, Snowflake).
- Knowledge of version control systems (e.g., Git) and CI/CD practices.
Preferred Qualifications:
- Experience with streaming data technologies (e.g., Kafka, Kinesis, Spark Streaming).
- Exposure to cloud platforms (AWS, GCP, Azure) and their data services.
- Understanding of data modelling (dimensional, star schema) and optimization techniques.
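On the dimensional-modelling point above: a star schema centres a fact table on foreign keys into dimension tables, so analytics queries become joins plus an aggregate. A minimal sketch using SQLite (all table, column, and row values are illustrative):

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO dim_date    VALUES (10, 2023), (11, 2024);
INSERT INTO fact_sales  VALUES (1, 10, 5.0), (1, 11, 7.5), (2, 11, 3.0);
""")

# The access pattern the star layout is optimized for: join the fact
# table to its dimensions, then group and aggregate.
rows = conn.execute("""
    SELECT p.name, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date    d ON d.date_id    = f.date_id
    GROUP BY p.name, d.year
    ORDER BY p.name, d.year
""").fetchall()
print(rows)  # [('gadget', 2024, 3.0), ('widget', 2023, 5.0), ('widget', 2024, 7.5)]
```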
Soft Skills:
- Team player with a collaborative mindset and ability to mentor junior engineers.
- Strong stakeholder management skills to align technical solutions with business goals.
- Excellent communication skills to explain technical concepts to non-technical audiences.
- Proactive problem-solving and adaptability in fast-paced environments.
If interested, apply or reply with your updated profile so we can connect and discuss.
● 1-2 years in an outbound-focused sales position desired; customer-oriented background required (sales, support, customer service), preferably in B2B
● Bachelor’s degree in business, marketing, or related field
● Excellent communication and interpersonal skills
● The ability to generate leads independently.
● Willingness to learn and apply the fundamentals of the MEDDIC Sales process approach
● Great to have: Artificial Intelligence, Machine Learning, and/or Computer Vision experience
● Knowledge of the AI industry and emerging technologies

Responsibilities:
You will get a chance to create products from scratch. While you will get the advantage of the scale of the organization, you are expected to come up with creative solutions to challenging problems.
On a typical day, you'd work with highly skilled engineers to solve complex problems. This is an early-stage initiative. Your ability to translate business requirements, and develop and demonstrate quick prototypes or concepts with other technology teams will be of great value.
You will learn and work with a variety of languages and platforms such as C/C++, Python, and Linux, as well as work on BLE, MEMS, biometric sensors, and the latest wireless technologies.
Requirements:
6+ years of Embedded firmware development experience in C/C++
BLE/GPS/GSM/RTOS stack expertise
Hands-on experience with Lab equipment (VNA/RSA/MSO etc).
Testing environment setup using automation scripts and networking equipment; practices for the full software development life cycle, including coding standards, code reviews, source control management, and continuous integration
Familiar with Wireless/IoT network protocols and standards.
Experience with microcontrollers, sensors, and serial communication.
Preferred experience with Wear OS/Tizen
Superior presentation and communication skills, both written and verbal
Bachelor's/Master's degree in electrical/electronic/communications engineering, information technology, physics, or a related field from Tier 1/Tier 2 engineering colleges only (IITs/NITs/IIITs/BITS, etc.)
Result-oriented and ready to take ownership; exhibits strong teamwork
HR Executive responsibilities include:
· Designing compensation and benefits packages
· Implementing performance review procedures (e.g. quarterly/annual and 360° evaluations)
· Developing fair HR policies and ensuring employees understand and comply with them
Job brief
We are looking for an HR Executive to manage our company’s recruiting, learning and development and employee performance programs.
HR Executive responsibilities include creating referral programs, updating HR policies and overseeing our hiring processes. To be successful in this role, you should have an extensive background in Human Resources departments and thorough knowledge of labor legislation.
Ultimately, you will make strategic decisions for our company so that we hire, develop and retain qualified employees.
Responsibilities
· Design compensation and benefits packages
· Implement performance review procedures (e.g. quarterly/annual and 360° evaluations)
· Develop fair HR policies and ensure employees understand and comply with them
· Implement effective sourcing, screening and interviewing techniques
· Assess training needs and coordinate learning and development initiatives for all employees
· Monitor HR department’s budget
· Act as the point of contact regarding labor legislation issues
· Manage employees’ grievances
· Create and run referral bonus programs
· Review current HR technology and recommend more effective software (including HRIS and ATS)
· Measure employee retention and turnover rates
· Oversee daily operations of the HR department
Requirements and skills
· Proven work experience as an HR Executive.
· Familiarity with Human Resources Management Systems and Applicant Tracking Systems
· Experience with full-cycle recruiting
· Good knowledge of labor legislation (particularly employment contracts, employee leaves and insurance)
· Demonstrable leadership abilities
· Solid communication skills

About us
SteelEye is the only regulatory compliance technology and data analytics firm that offers transaction reporting, record keeping, trade reconstruction, best execution and data insight in one comprehensive solution. The firm’s scalable secure data storage platform offers encryption at rest and in flight and best-in-class analytics to help financial firms meet regulatory obligations and gain competitive advantage.
The company has a highly experienced management team and a strong board, who have decades of technology and management experience and worked in senior positions at many leading international financial businesses. We are a young company that shares a commitment to learning, being smart, working hard and being honest in all we do and striving to do that better each day. We value all our colleagues equally and everyone should feel able to speak up, propose an idea, point out a mistake and feel safe, happy and be themselves at work.
Being part of a start-up can be as exciting as it is challenging. You will be part of the SteelEye team not just because of your talent but also because of your entrepreneurial flair, which we thrive on at SteelEye. This means we want you to be curious, contribute, ask questions and share ideas. We encourage you to get involved in helping shape our business.
What you will do
- Deliver plugins for our Python-based ETL pipelines.
- Deliver Python services for provisioning and managing cloud infrastructure.
- Design, develop, unit test, and support code in production.
- Deal with challenges associated with large volumes of data.
- Manage expectations with internal stakeholders and context switch between multiple deliverables as priorities change.
- Thrive in an environment that uses AWS and Elasticsearch extensively.
- Keep abreast of technology and contribute to the evolution of the product.
- Champion best practices and provide mentorship.
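One common way to structure "plugins for ETL pipelines", as the list above describes, is a registry of named transform steps that compose into a pipeline. The sketch below is invented for illustration (the registry, step names, and record shape are not SteelEye's actual API); it shows the pattern, not the product:

```python
from typing import Callable, Dict, List

# Hypothetical plugin registry. Each plugin is a step that takes and
# returns a list of record dicts, so steps compose into a pipeline.
PLUGINS: Dict[str, Callable[[List[dict]], List[dict]]] = {}

def plugin(name: str):
    """Decorator that registers a transform step under a name."""
    def register(fn):
        PLUGINS[name] = fn
        return fn
    return register

@plugin("drop_nulls")
def drop_nulls(records):
    # Drop records missing a price, e.g. malformed trade events.
    return [r for r in records if r.get("price") is not None]

@plugin("to_usd_cents")
def to_usd_cents(records):
    # Normalize prices to integer cents to avoid float drift downstream.
    return [{**r, "price": round(r["price"] * 100)} for r in records]

def run_pipeline(records, steps):
    """Apply the named plugins in order."""
    for name in steps:
        records = PLUGINS[name](records)
    return records

trades = [{"id": 1, "price": 10.25}, {"id": 2, "price": None}]
result = run_pipeline(trades, ["drop_nulls", "to_usd_cents"])
print(result)  # [{'id': 1, 'price': 1025}]
```

Because steps are looked up by name, a pipeline becomes pure configuration (a list of strings), which is what makes plugins independently deliverable.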
What we're looking for
- Python 3.
- Python libraries used for data (such as pandas and NumPy).
- AWS.
- Elasticsearch.
- Performance tuning.
- Object Oriented Design and Modelling.
- Delivering complex software, ideally in a FinTech setting.
- CI/CD tools.
- Knowledge of design patterns.
- Sharp analytical and problem-solving skills.
- Strong sense of ownership.
- Demonstrable desire to learn and grow.
- Excellent written and oral communication skills.
- Mature collaboration and mentoring abilities.
What will you get?
- This is an individual contributor role, so if you are someone who loves to code, solve complex problems, and build amazing products without worrying about anything else, this is the role for you.
- You will have the chance to learn from the best in the business, who have worked across the world and are technology geeks.
- A company that always appreciates ownership and initiative. If you are someone who is full of ideas, this role is for you.
- Strong communication skills (written and verbal)
- Responsive, reliable and results oriented with the ability to execute on aggressive plans
- A background in software development, with experience of working in an agile product software development environment
- An understanding of modern deployment tools (Git, Bitbucket, Jenkins, etc.), workflow tools (Jira, Confluence) and practices (Agile (SCRUM), DevOps, etc.)
- Expert-level experience with AWS tools, technologies, and associated APIs: IAM, CloudFormation, CloudWatch, AMIs, SNS, EC2, EBS, EFS, S3, RDS, VPC, ELB, Route 53, Security Groups, Lambda, etc.
- Hands on experience with Kubernetes (EKS preferred)
- Strong DevOps skills across CI/CD and configuration management using Jenkins, Ansible, Terraform, Docker.
- Experience provisioning and spinning up AWS Clusters using Terraform, Helm, Helm Charts
- Ability to work across multiple projects simultaneously
- Ability to manage and work with teams and customers across the globe


problems to impact a billion people.
● You will need to choose which architectures suit future requirements and mold the
relevant modules accordingly.
● Ownership of product/business requirements.
● Craft opportunities for reusable frameworks and toolkits that would be used across iOS teams.
● Work closely with the relevant platform stakeholders and collaborate with multiple product teams.
● Own the app's performance and health metrics, and build an app for the next billion people.
● Review cross-team work critically and ensure it is appropriately broken down, prioritized, and well understood by all involved teams.
Technical DNAs Expected
● Proficiency in Swift and Objective-C; at least novice-level familiarity with backend development.
● Solid fundamentals of data structures, algorithms, and system design.
● Good understanding of internal and external libraries; writes code with useful abstraction and separation of concerns.
● Comfortable with concurrency and multithreading.
● Prefers to reduce third-party dependencies, opting for them only when essential.
● Gatekeeper for the master branch; maintains code-integration strategies.
● Drives a good degree of predictability (estimation, planning) in deliverables.
● Proficient with CI/CD pipelines and Fastlane tools.
Document Credit: iOS Team
Software Development Engineer II, iOS Development.
● 3+ years of full-time, professional software development experience.
● Inclination towards reactive programming.
Good To Have
● Contribution towards the iOS Community
● Exposure to Swift Package Manager and Swift UI
1. Self-starter familiar with working amid organized chaos, since that is how most agencies function
2. Well aware of social media metrics, analytics, and platforms
3. Understands how to coordinate with clients as well as with teams internally
4. Mobile; we are working remotely but are always on our toes, and we may start a hybrid office in two to three months
5. Is a digital native and is trainable for offline work
6. Expected to run job lists and networks of all current jobs and projects
7. Expected to be punctual and accountable for his/her work and the work of the team
8. Expected to understand both B2B and B2C categories
9. Expected to understand and be able to craft creative briefs
10. Expected to understand how to function under timelines and run timelines
11. Experience in the beauty category would be an added advantage
12. Expected to have the ability to multitask
13. Shoot exposure will be an added advantage

Desirable: Capable of designing any responsive UI; Rational problem-solving approach
Skills Required
- MERN Stack
- Laravel + Vue.js
Requirements
- Realtime application using Pusher/similar services
- Must be familiar with commonly used design patterns
- Good knowledge of inheritance and use of a single source of truth
- Familiarity with databases (MySQL, Redis, MongoDB) and web servers such as Apache and Nginx
- Must have experience with GitHub or GitLab
- Ability to write testable code; knowledge of CI/CD is a plus
- Excellent problem solving and debugging Skills
- Excellent communication and teamwork skills
- Working knowledge of cloud platforms will be an added advantage
- Quick learner
- Great attention to detail
- Organizational skills
- Should be cooperative with colleagues as well as keen to learn new things.
Responsibilities:
- Work with development teams and product managers to implement software solutions
- Design client-server architecture
- Build the front-end of applications through appealing visual design
- Develop and manage well-functioning databases and applications
- Write effective APIs
- Test software to ensure responsiveness and efficiency
- Must have efficient troubleshooting and debugging skills
- Create security and data protection settings
- Build features and applications with a mobile responsive design
- Should have the ability to train and mentor fellow associates and trainees
- Develop telemetry software to connect Junos devices to the cloud
- Rapidly prototype and lay the software foundation for product solutions
- Move prototype solutions to a production, cloud-based multitenant SaaS solution
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Build analytics tools that utilize the data pipeline to provide significant insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with partners including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics specialists to strive for greater functionality in our data systems.
Qualification and Desired Experiences
- Master's degree in Computer Science, Electrical Engineering, Statistics, Applied Math, or an equivalent field, with a strong mathematical background
- 5+ years of experience building data pipelines for data science-driven solutions
- Strong hands-on coding skills (preferably in Python), processing large-scale data sets and developing machine learning models
- Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, and TensorFlow
- Good team worker with excellent interpersonal, written, verbal, and presentation skills
- Create and maintain optimal data pipeline architecture.
- Assemble large, sophisticated data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Experience with AWS, S3, Flink, Spark, Kafka, Elastic Search
- Previous work in a start-up environment
- 3+ years of experience building data pipelines for data science-driven solutions
- We are looking for a candidate with 9+ years of experience in a Data Engineer role who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and find opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Proven understanding of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and interpersonal skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
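Several bullets above mention message queuing and stream processing; the core of many such jobs is a windowed aggregation. Below is a minimal tumbling-window count in plain Python, standing in for what Kafka Streams or Spark Streaming would do at scale; the event times, keys, and window size are made up for illustration:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per key within each window."""
    counts = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Every event maps to exactly one window, identified by its start.
        window_start = (ts // window_secs) * window_secs
        counts[window_start][key] += 1
    return {w: dict(k) for w, k in counts.items()}

# Hypothetical click events as (unix_seconds, user) pairs.
events = [(100, "a"), (103, "b"), (104, "a"), (161, "a")]
windows = tumbling_window_counts(events, window_secs=60)
print(windows)  # {60: {'a': 2, 'b': 1}, 120: {'a': 1}}
```

Real stream processors add what this sketch omits: late-event handling via watermarks, state checkpointing, and partitioned parallelism across a message queue.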

