Job opportunities at iFetch
Last updated: February 25, 2022
You will be part of, and report to, a business-inspired team of Scientists, designing model-serving solutions and developing scalable machine learning applications;
Partner with Data Science team members and project coordinators to define technical solutions and project execution plans;
Help collect key feedback from focus groups, enabling corrective actions throughout the project;
Publish scientific and technical work in leading conferences and journals.
If you are interested or keen to learn more, send your resume through our recruitment platform (available soon).
This job offer has also been listed on our blog. Reach out to our PI via (LinkedIn/Twitter).
Below is a list of topics that we are keen to explore as part of the iFetch project.
This is not a comprehensive list.
This page has general information about iFetch initiatives and specific information for students at IST. Read each section carefully; for further information, contact João Paulo Costeira by email.
ISR/IST is in the process of opening two research scholarships for applicants holding a Master’s degree in Engineering or Computer Science. The two scholarships are intended for developing research and implementing iFetch subsystems related to computer vision and machine learning tasks. Specifically, candidates will work on visual search and multimodal representations for recommendation.
To register and obtain further information, fill in the online form or contact João Paulo Costeira.
iFetch looks forward to developing collaborative research in the context of PhD programs, if this is on your horizon. These scholarships have a longer time frame (minimum 2 years) and can serve as a “lever” for strong applications to future Dual Degree or Affiliated PhDs with Carnegie Mellon University. Check the CMUPortugal program website for details or contact us via the link above.
If you are in the process of selecting a Thesis topic, contact João Paulo Costeira.
None for the moment.