Preferred work start time is 8 AM IST, as the client is located in Australia.
We are seeking a skilled C# Developer to join our team. The ideal candidate will have a strong background in C# programming and experience in developing SSRS reports. Familiarity with Azure Functions is a plus. This role requires a quick learner who can adapt to new technologies and challenges with ease.
Responsibilities:
- Develop and maintain applications using C#.
- Create and manage SSRS reports to support business needs.
- Optionally work with Azure Functions to build and deploy services.
- Participate in all phases of the software development lifecycle.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Identify and correct bottlenecks and fix bugs.
- Help maintain code quality, organization, and automation.
Qualifications:
- Proven experience as a C# developer.
- Strong knowledge of the .NET framework.
- Proficient in creating and managing SSRS reports.
- Experience with Azure Functions is advantageous but not required (see the sketch after this list).
- Excellent problem-solving skills and ability to work independently.
- Adept at learning new technologies quickly.
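Since the posting above mentions Azure Functions, here is a minimal HTTP-triggered function, shown in Python purely for illustration (the role itself is C#-focused, and the equivalent C# function follows the same trigger-and-binding model); the route and response payload are hypothetical.

```python
import json
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Echo an optional "name" query parameter back as JSON.
    name = req.params.get("name", "world")
    body = json.dumps({"message": f"Hello, {name}!"})
    return func.HttpResponse(body, mimetype="application/json", status_code=200)
```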
Additional Notes:
- The ideal candidate would be able to start their day at 8 AM IST to synchronize with the team in Perth, Australia. While this is highly preferred rather than a strict requirement, candidates should still be prepared for early starts to attend essential team meetings.
What We Offer:
- Competitive salary and benefits package.
- Opportunity to work on exciting projects with cutting-edge technologies.
- A supportive and collaborative work environment.
- Professional growth and development opportunities.
Job Type: Full-time
We are looking for an experienced Media and PR Specialist who can help us elevate our brand presence and manage our media relations. The ideal candidate should have a proven track record in the B2B events industry and possess a deep understanding of media strategy and public relations.
Responsibilities & Scope of Work:
1. Media Strategy: Develop and execute a comprehensive media strategy to enhance our brand visibility and reputation within the industry.
2. Press Releases: Draft compelling press releases and news articles to announce company milestones, event launches, and other newsworthy developments.
3. Media Relations: Build and maintain strong relationships with key media outlets, journalists, and industry influencers.
4. Content Creation: Create engaging and informative content for various media channels, including social media, newsletters, and the company blog.
5. Event Promotion: Collaborate with the marketing team to promote our events through media channels, securing coverage and partnerships.
6. Analytics: Monitor and report on media coverage and PR efforts, providing insights and recommendations for continuous improvement.
Requirements and Qualifications:
- Bachelor's degree in Communications, Public Relations, or a related field.
- 2+ years of experience in media relations and PR, preferably in the B2B events industry.
- Proven success in securing media coverage and building media relationships.
- Exceptional written and verbal communication skills.
- Strong organizational and project management abilities.
- Proficiency in PR software and analytics tools.
- Creative thinking and the ability to generate innovative PR ideas.
Job Type:
1. Full-time & on-site
2. 5-day work week
Location: Bangalore/Mangalore
Brief:
As a BI Developer at GradRight, you’ll be working with Tableau and supporting data sources to build reports for the requirements of various business teams.
Responsibilities:
- Translate business needs to technical specifications for reports and dashboards
- Design, build and deploy BI solutions
- Maintain and support data analytics platforms (e.g. Tableau, Mixpanel, Google Analytics, etc.)
- Evaluate and improve existing BI systems
- Collaborate with teams to integrate systems
- Develop and execute database queries, conduct analysis and prepare data to be shared with respective stakeholders
- Create visualizations and reports for requested projects
- Develop and update technical documentation around reports
Requirements:
- At least 3 years of proven experience as a BI Developer
- Experience at a startup
- Background in data warehouse design (e.g. dimensional modeling) and data mining
- In-depth understanding of database management systems, online analytical processing (OLAP) and ETL (Extract, transform, load) framework
- Working knowledge of Tableau
- Knowledge of SQL queries and MongoDB (see the sketch after this list)
- Proven abilities to take initiative and be innovative
- Analytical mind with a problem-solving aptitude
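As a concrete illustration of the query-and-prepare responsibilities listed above, here is a minimal sketch that aggregates MongoDB data with pymongo and writes a CSV extract that a Tableau workbook could consume; the connection string, database, collection and field names are all hypothetical.

```python
import csv
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # assumed local instance
orders = client["analytics"]["orders"]              # hypothetical collection

# Aggregate monthly revenue per region -- a typical shape for a BI extract.
pipeline = [
    {"$group": {
        "_id": {"region": "$region", "month": {"$substr": ["$order_date", 0, 7]}},
        "revenue": {"$sum": "$amount"},
    }},
    {"$sort": {"_id.month": 1}},
]

with open("monthly_revenue.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["region", "month", "revenue"])
    for row in orders.aggregate(pipeline):
        writer.writerow([row["_id"]["region"], row["_id"]["month"], row["revenue"]])
```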
Sr. Data Engineer
Company Profile:
Bigfoot Retail Solutions [Shiprocket] is a logistics platform that connects Indian eCommerce SMBs with logistics players to enable end-to-end solutions.
Our innovative, data-backed platform drives logistics efficiency, helps reduce cost, increases sales throughput by reducing RTO and improves post-order customer engagement and experience.
Our vision is to power all logistics for the direct commerce market in India, including first mile, linehaul, last mile, warehousing, cross-border and O2O.
Position: Sr. Data Engineer
Team: Business Intelligence
Location: New Delhi
Job Description:
We are looking for a savvy Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure that optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.
Key Responsibilities:
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies (see the sketch after this list).
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centres and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
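For the extract/transform/load bullet above, here is a minimal PySpark sketch of the kind of pipeline described: read raw JSON events from an (assumed) S3 landing bucket, clean them, and write partitioned Parquet for downstream SQL/BI tools. Bucket paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw JSON events from the (hypothetical) landing bucket.
raw = spark.read.json("s3://example-landing/orders/2024/")

# Transform: drop malformed rows, normalise the timestamp, derive a date column.
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet that downstream SQL/BI tools can query.
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3://example-warehouse/orders/")
```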
Qualifications for Data Engineer:
- Advanced SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (see the sketch after this list)
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
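To illustrate the workflow-management tooling listed above, here is a minimal Airflow DAG sketch chaining a daily extract task into a load task; the DAG id, schedule and task bodies are hypothetical placeholders.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Pull the day's records from the source system (stubbed out here).
    print("extracting", context["ds"])

def load(**context):
    # Push the transformed records into the warehouse (stubbed out here).
    print("loading", context["ds"])

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```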
Handling job portals.
Posting jobs on portals (Monster, Naukri, LinkedIn, Indeed, etc.).
Drafting job descriptions.
Screening candidates from job portals.
Updating daily work report.
Pitching candidates for available vacancies, scheduling interviews.
Taking follow-ups from time to time.
Coordinating with candidates through to joining.
Maintaining a database of candidates in an Excel sheet.
Maintaining a pipeline of candidates and persuading them to proceed.
- Excellent communication skills.
- They should be able to handle inbound calls.
- They must provide training to customers.
- They need to provide technical help with the software to customers.
- They must be able to resolve client problems over calls and emails.
- Excellent understanding of Excel, Pages, Word, Numbers & Keynote.
The key aspects of this role include:
• Design, build, and maintain scalable applications using Python.
• Contribute to the entire implementation process, including driving the definition of improvements based on business needs and architectural considerations.
• Act as a subject matter expert for Application Software developers and Engineers.
• Handle server-side code for a production platform and contribute to new features.
To be the right fit, you'll need:
• 4+ years of experience as a software developer in Python, with knowledge of at least one Python web framework such as Django or Flask (see the sketch after this list)
• Good understanding of common design patterns and architecture principles to design reliable and scalable applications
• Strong communication skills
• Knowledge of NoSQL databases such as MongoDB
• Good to have: experience with AWS, Docker, or web services
• Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
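As a small illustration of the web-framework requirement above, here is a minimal Flask application exposing a single health-check endpoint; the route and payload are hypothetical, and Django or another framework would serve the same purpose.

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    # A simple liveness endpoint of the kind most production services expose.
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(port=8000, debug=True)
```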
Job Description:
Hiring a Staff Engineer (Back End) for a leading product-based company at DLF IT Park, Chennai.
Skill Set:
- Strong experience in any programming language (Ruby, Go, Java, or other high-performance languages), architecture, design (HLD/LLD), data structures, algorithms, hands-on coding, problem solving, etc.
- Experience in web technology is a must.
- Looking for candidates with good experience in product development.
- Candidates from product development companies will be preferred.
- Candidates willing to relocate to Chennai, or who prefer Chennai, can apply.
Responsibilities :
- Analyze and drive product requirements
- Architect and design product features for scale and maintainability
- Lead in the design, implementation, and deployment of successful systems and services
- Ensure the quality of architecture and design of systems
- Implement code with very high coverage of unit tests and component tests
- Perform design and code reviews
- Functionally decompose complex problems into simple, straightforward solutions
- Fully understand system interdependencies and limitations
- Possess expert knowledge in performance, security, scalability, architecture, and best practices
- Software development of high quality/availability core systems
- Cross-training peers and mentoring teammates
- Document HLD/LLD for easy knowledge sharing and future scaling
Must have :
- 8-12 years of experience designing, integrating and developing distributed applications in Ruby, Go, Java, or other high-performance languages
- Experience with cluster and container orchestration systems such as Docker, Mesos, Marathon, Salt or Kubernetes.
- Experience with Service design, systems engineering, API Design and versioning
- Understanding of Design Patterns, Serverless computing, cloud-first architecture, TDD, BDD, CI/CD, Integration Patterns
Good to have :
- Experience building distributed systems using Kafka. Strong grasp of fundamental concepts of Kafka and ZooKeeper, and of building producer and consumer applications using Kafka (see the sketch after this list)
- Familiarity writing and optimizing advanced SQL queries
- Good Linux/UNIX systems knowledge
- Experience with AWS compute and storage PaaS services; AWS Certified Solutions Architect certification is nice to have.
- Experience productionizing Machine Learning models
- Experience publishing technical papers at reputed conferences.
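To illustrate the Kafka item above, here is a minimal producer/consumer sketch using the kafka-python client, written in Python for illustration even though the role lists Ruby, Go and Java; the broker address and topic name are hypothetical.

```python
import json
from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"   # assumed local broker
TOPIC = "order-events"      # hypothetical topic

# Producer: publish a JSON-encoded event.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"order_id": 42, "status": "created"})
producer.flush()

# Consumer: read events from the beginning of the topic and print them.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    consumer_timeout_ms=5000,   # stop iterating if no new messages arrive
)
for message in consumer:
    print(message.value)
```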