
JP Morgan Artificial Intelligence Interview Questions

Interview Questions: ML Domain (5)
1. Using RoBERTa as a Classifier
2. Comparison of BERT and Word2Vec
3. LLM for QnA on Large Documents
4. Understanding of Recommendation Algorithms
5. Reason for Transition to LLM
1. Using RoBERTa as a Classifier
Explain why RoBERTa, which is fundamentally an encoder-only model, can be used as a classifier.
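One way to ground an answer is to show how an encoder-only model becomes a classifier in practice: a pooled encoder representation is passed through a small classification head that is fine-tuned on labeled data. The sketch below assumes the Hugging Face transformers library and the roberta-base checkpoint; the two-label setup and example sentence are invented for illustration.

```python
# A minimal sketch, assuming Hugging Face transformers is installed.
# RobertaForSequenceClassification puts a randomly initialized classification
# head on top of the pretrained encoder; it still needs fine-tuning on
# labeled data before its predictions mean anything.
import torch
from transformers import AutoTokenizer, RobertaForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = RobertaForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=2  # e.g. positive / negative sentiment
)

inputs = tokenizer("The quarterly report looks promising.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits          # shape: (1, num_labels)
predicted_class = logits.argmax(dim=-1).item()
print(predicted_class)
```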
2. Comparison of BERT and Word2Vec
Compare BERT and Word2Vec, and explain why BERT is generally considered the better choice for most NLP tasks.
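The core of a typical answer is that Word2Vec assigns each word a single static vector, while BERT produces context-dependent vectors. The sketch below, which assumes Hugging Face transformers and the bert-base-uncased checkpoint with made-up example sentences, shows that the same word gets different embeddings in different contexts.

```python
# A minimal sketch, assuming Hugging Face transformers; the sentences are
# invented to show that BERT, unlike Word2Vec, gives the word "bank" a
# different vector depending on its context.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed_word(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of the first occurrence of `word`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]   # (seq_len, hidden_dim)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

v_river = embed_word("she sat on the bank of the river", "bank")
v_money = embed_word("he deposited cash at the bank", "bank")
sim = torch.cosine_similarity(v_river, v_money, dim=0)
print(f"cosine similarity across contexts: {sim.item():.3f}")  # below 1.0
# A static Word2Vec model would return the identical vector in both sentences.
```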
3. LLM for QnA on Large Documents
How would you enable a large language model (LLM) to perform QnA over many large documents? Are you familiar with the concept of Retrieval-Augmented Generation (RAG)?
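A typical RAG answer has two steps: retrieve the most relevant document chunks for the question, then generate an answer conditioned on those chunks. The sketch below assumes the sentence-transformers package and the all-MiniLM-L6-v2 embedding model; the document chunks and the call_llm function are placeholders for illustration.

```python
# A minimal RAG sketch, assuming the sentence-transformers package; chunking,
# the corpus, and `call_llm` are placeholders for illustration only.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Chunk 1 of a large report ...",
    "Chunk 2 covering revenue figures ...",
    "Chunk 3 about risk factors ...",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = encoder.encode(documents, normalize_embeddings=True)

def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in the LLM API of your choice here")

def answer(question: str, top_k: int = 2) -> str:
    # 1. Retrieve: embed the question and take the most similar chunks.
    q_vec = encoder.encode([question], normalize_embeddings=True)[0]
    scores = doc_vecs @ q_vec                        # cosine similarity
    best = np.argsort(scores)[::-1][:top_k]
    context = "\n".join(documents[i] for i in best)
    # 2. Generate: ask the LLM to answer using only the retrieved context.
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return call_llm(prompt)
```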
4. Understanding of Recommendation Algorithms
Do you have knowledge of recommendation algorithms? Can you discuss how you would train a recommendation model that incorporates Natural Language Processing (NLP)?
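One common way to combine recommendation with NLP is a two-tower model: a user tower learns ID embeddings, an item tower encodes item text, and relevance is scored by a dot product. The PyTorch sketch below uses entirely synthetic data and a bag-of-words stand-in for the text tower; in practice that tower would usually be a pretrained encoder such as BERT.

```python
# A minimal two-tower sketch in PyTorch: user IDs go through an embedding
# table, item text goes through a stand-in text encoder, and the model is
# trained on implicit click labels. All data here is synthetic.
import torch
import torch.nn as nn

VOCAB, EMB, N_USERS = 1000, 32, 50

class TwoTower(nn.Module):
    def __init__(self):
        super().__init__()
        self.user_emb = nn.Embedding(N_USERS, EMB)
        # Stand-in text encoder: mean of token embeddings (bag of words).
        self.item_text = nn.EmbeddingBag(VOCAB, EMB, mode="mean")

    def forward(self, user_ids, item_token_ids):
        u = self.user_emb(user_ids)            # (batch, EMB)
        v = self.item_text(item_token_ids)     # (batch, EMB)
        return (u * v).sum(dim=-1)             # dot-product relevance score

model = TwoTower()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Synthetic batch: user ids, tokenized item descriptions, click labels.
users = torch.randint(0, N_USERS, (64,))
item_tokens = torch.randint(0, VOCAB, (64, 12))   # 12 tokens per item
clicks = torch.randint(0, 2, (64,)).float()

for _ in range(5):                                # a few toy training steps
    opt.zero_grad()
    loss = loss_fn(model(users, item_tokens), clicks)
    loss.backward()
    opt.step()
print(f"toy loss: {loss.item():.3f}")
```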
5. Reason for Transition to LLM
What motivated you to transition to the field of Large Language Models (LLM), and do you plan to pursue a long-term career in this area?