
TCAM (Telematics & Connectivity Antenna Module) Engineer
InfoGrowth is looking for a skilled TCAM Engineer with expertise in the Signals & Antenna Domain to join our innovative team. If you have a passion for automotive infotainment and antenna systems and enjoy leading teams and solving complex connectivity challenges, this is the role for you!
Key Responsibilities:
- Leverage deep Antenna System knowledge specifically within the Automotive Infotainment Domain.
- Implement solutions for Telematics & Connectivity Antenna Module (TCAM) within automotive systems.
- Lead and collaborate on antenna system development, including Design, Simulation, Data Analysis, and Validation.
- Debug and resolve antenna-related issues, working hands-on with the team.
- Work closely with vendors to enhance the antenna fabrication process and explore innovative antenna materials to meet performance standards.
- Conduct antenna characterizations including input impedance, impedance matching, antenna isolation, gain, polarization, radiation patterns, and ECC.
- Utilize expertise in embedded C/C++, vehicle signals, AutoSAR, cybersecurity, Linux, and RTOS concepts.
- Prototype, tune, and measure antenna systems using VNA and anechoic chamber systems.
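As a rough illustration of the characterization work listed above, the sketch below computes two of the named figures of merit in plain Python: the reflection coefficient / return loss from an antenna's input impedance, and the envelope correlation coefficient (ECC) from two-port S-parameters using the standard closed-form expression (which assumes lossless antennas). The numeric impedance and S-parameter values are invented purely for illustration.

```python
# Minimal sketch of two antenna figures of merit, assuming a 50-ohm
# reference system and lossless antennas for the ECC formula.
# All numeric values below are illustrative, not measured data.
import math


def reflection_coefficient(z_in: complex, z0: float = 50.0) -> complex:
    """Reflection coefficient (Gamma) of an antenna with input impedance
    z_in against a reference impedance z0 (typically 50 ohms)."""
    return (z_in - z0) / (z_in + z0)


def return_loss_db(z_in: complex, z0: float = 50.0) -> float:
    """Return loss in dB; a larger value means a better impedance match."""
    return -20.0 * math.log10(abs(reflection_coefficient(z_in, z0)))


def ecc_from_s_params(s11: complex, s12: complex,
                      s21: complex, s22: complex) -> float:
    """Envelope correlation coefficient of a two-antenna system from
    S-parameters (standard closed form, lossless-antenna assumption)."""
    num = abs(s11.conjugate() * s12 + s21.conjugate() * s22) ** 2
    den = ((1 - abs(s11) ** 2 - abs(s21) ** 2)
           * (1 - abs(s12) ** 2 - abs(s22) ** 2))
    return num / den


# Example with made-up values: a slightly mismatched antenna and a
# well-isolated two-antenna pair.
rl = return_loss_db(45 + 10j)
ecc = ecc_from_s_params(0.1 + 0.05j, 0.02 - 0.01j, 0.02 - 0.01j, 0.12 + 0.03j)
```

In practice these quantities come straight from VNA measurements; a sketch like this is mainly useful for sanity-checking exported S-parameter data during debugging.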
Key Skills:
- Proven expertise in Antenna Systems in the Automotive Infotainment Domain.
- Hands-on experience with embedded automotive connectivity processors (TCAM).
- Strong knowledge of vehicle signals, AutoSAR, and system architecture.
- In-depth understanding of antenna design, fabrication, and characterization.
- Experience in debugging, fixing antenna-related issues, and collaborating with vendors.
- Proficient in C/C++, Linux, RTOS, and cybersecurity concepts.
- Excellent communication skills with the ability to lead, inspire, and mentor team members.

Similar jobs

The Sr AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You will collect, aggregate, and analyze structured and unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements:
- Strong experience as an AWS/Azure/GCP Data Engineer; must have AWS/Azure/GCP Databricks experience
- Expert proficiency in Spark (Scala and Python)
- Must have data migration experience from on-prem to cloud
- Hands-on experience with Kinesis to process and analyze stream data, Event/IoT Hubs, and Cosmos DB
- In-depth understanding of AWS/Azure/GCP cloud, data lake, and analytics solutions
- Expert-level, hands-on experience designing and developing applications on Databricks
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
- In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
- Hands-on experience with the industry technology stack for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN
- Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs
- Experience preparing data for data science and machine learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
- Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
- Good to have programming experience with .NET or Spark/Scala
- Experience creating tables, partitioning, bucketing, and loading and aggregating data using Spark Scala, Spark SQL, or PySpark
- Knowledge of AWS/Azure/GCP DevOps processes such as CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
- Working experience with Visual Studio, PowerShell scripting, and ARM templates; able to build ingestion to ADLS and enable a BI layer for analytics
- Strong understanding of data modeling and defining conceptual, logical, and physical data models
- Big data/analytics/information analysis/database management in the cloud
- IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros and cons, and migration considerations
- Ability to stay up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Guide customers in transforming big data projects, including development and deployment of big data and AI applications
- Guide customers on data engineering best practices; provide proofs of concept, architect solutions, and collaborate as needed
- 2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehousing, and machine learning solutions
- 5+ years of overall experience in software development, data engineering, or data analytics using Python, PySpark, Scala, Spark, Java, or equivalent technologies, with hands-on expertise in Apache Spark (Scala or Python)
- 3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Ability to manage competing priorities in a fast-paced environment
- Ability to resolve issues
- Basic experience with or knowledge of Agile methodologies
- AWS Certified: Solutions Architect Professional
- Databricks Certified Associate Developer for Apache Spark
- Microsoft Certified: Azure Data Engineer Associate
- Google Cloud Certified: Professional
Accountant
Responsibilities:
- Maintain accurate financial records and transactions
- Prepare financial statements, including income statements and balance sheets
- Analyze financial data to identify trends and provide insights for decision-making
- Assist in budgeting, forecasting, and monitoring actual performance against targets
- Ensure tax compliance, including preparing and filing tax returns
- Coordinate and facilitate internal and external audits
- Utilize accounting software and generate reports; ensure compliance with financial regulations and report to regulatory bodies
- Assess financial risks and propose mitigation strategies; provide financial advice and guidance based on analysis and market trends
- Visit our client offices and collect payments
Requirements:
- A minimum of 1 year of experience in finance and accounting
- A Bachelor's or Master's degree in a relevant accounting field, or a professional accounting qualification
- Proficient in Tally and Management Information Systems (MIS)
- Excellent communication skills for conveying financial information
- Strong knowledge of accounting principles, financial reporting standards, and tax regulations
- Analytical skills to interpret financial data and identify patterns
- Proficiency in accounting software, spreadsheet applications, and the MS Office suite
- Problem-solving abilities to identify and address financial issues
- Only male candidates may apply

Skills Required:
- Good experience with the Python programming language
- Strong experience in Docker.
- Good knowledge of at least one cloud platform, such as Azure.
- Must be comfortable working in a Linux environment.
- Must have exposure to the IoT domain and its protocols (Zigbee, BLE, LoRa, Modbus).
- Must be a good team player.
- Strong Communication Skills

Oversee the sales process to attract new clients.
Work with senior team members to identify and manage risks.
Maintain fruitful relationships with clients and address their needs effectively.
Prepare and deliver pitches to potential investors.
Foster a collaborative environment within the organization.
Transfer calls to the supervisor after the initial explanation.
SKILLS:
Excellent spoken English.
Both Graduate/Undergraduate may apply.
Ability to develop good relationships with current and potential clients.
Excellent communication skills.
Knowledge of productivity tools and software.
Ready to work in Nightshift.
SEO Executive
Job Types: Full-time, Regular / Permanent
Experience: minimum 6 months in SEO
Location: Work from office at the company’s location, i.e. Dehradun, Uttarakhand
Responsibilities
· Track, report, and analyze website analytics
· Optimize copy and landing pages for search engine marketing
· Perform ongoing keyword discovery, expansion and optimization
· Research and implement search engine optimization recommendations
· Research and analyze competitor advertising links
· Develop and implement link building strategy
· Work with editorial and marketing teams to drive SEO in content
creation and content programming
· Recommend changes to website architecture, content, linking and
other factors to improve SEO positions for target keywords.
Requirements
- Proven SEM experience managing PPC campaigns across Google, Yahoo and Bing.
- Solid understanding of performance marketing, conversion, and online
customer acquisition.
- In-depth experience with website analytics tools.
- Experience with bid management tools.
- Working knowledge of HTML, CSS, and JavaScript development and constraints.
- Up-to-date with the latest trends and best practices in SEO and SEM.
About Ftechiz Solution Pvt Ltd:
Established in 2016, Ftechiz Solutions Pvt Ltd, with its dedicated and skilled team of professionals, creates dynamic and cost-effective solutions for its clients. We believe in delivering the best services to our clients without compromising on time or quality. Ftechiz Solutions Pvt Ltd was launched with a clear vision: to become a multi-skilled and multidimensional IT service provider focused on high-end strategic solutions, with the ultimate aim of evolving into a leading one-stop internet strategy consulting company. We have been successful because we keep our promises, and our simple but highly effective websites are known to engage viewers: the designs are eye-catching and the navigation is easy, making for a comfortable experience.


Full Stack / Node.js / React.js Developer:
We are looking for Developers responsible for managing the interchange of data between the server and the users. Your primary focus will be the development of all server-side logic, definition and maintenance of the central database, and ensuring high performance and responsiveness to requests from the front-end. You will also be responsible for integrating the front-end elements into the application. You will also be responsible for building rich UI components with React.
Skills: Node.js, React.js, MongoDB, Express JS, HTML, CSS.
Requirements:
- Should have experience with Node.js and the frameworks available for it, such as Express or StrongLoop (preferably Express).
- Should have experience in building Rich UI components with React.js.
- Should have hands-on experience with MongoDB.
- Strong understanding of JavaScript, its quirks and workarounds.
- Knowledge of ES6, ES7 and Object-oriented and functional programming.
- Understanding the nature of asynchronous programming and its quirks and workarounds.
- Should have experience working in a Linux (Ubuntu) environment and with basic Linux commands.
- Proficient understanding of code versioning tools, such as Git.
- Good to have knowledge on Redis DB.
- Good understanding of browser rendering behavior and performance.
- Good to have exposure working in an agile development environment.



Responsibilities
- Develop Magento extensions based on requirements given
- Customize 3rd party Magento extensions
- Diagnose and fix bugs in Magento websites
- Customize Magento functionalities by following Magento standards
- Optimize speed of Magento by identifying and fixing bottlenecks
Desired Skills
- Proficient in object-oriented PHP, MVC, JavaScript, jQuery, Prototype, SQL, HTML, and CSS
- Extensive experience developing Magento extensions and themes
- Deep understanding of the Magento architecture and data flow
- Thorough understanding of Magento concepts of Layouts, Blocks, Models, Controllers, Helpers, Observers, etc
- Understands the ORM concepts and database structure of Magento
- Knows how to extend Magento functionalities including REST APIs
- Understands the performance bottlenecks of Magento and the ability to write optimized code
- Knowledge in Magento 2 is an added advantage
- Knowledge in Linux environments
- Good knowledge of Git concepts and operations
- Knows how to consume 3rd party RESTful services
- Magento Developer Certification is an added advantage

