Math Problem Statement
Using a neural model with a convolutional architecture, we classify query-document pairs into two classes: relevant (index 0) and non-relevant (index 1). Suppose the current query q consists of the terms tq1 and tq2, and the current document consists of the terms td1, td2, and td3. Using word embeddings, we constructed the similarity matrix x [matrix not shown in this copy].
The convolutional network has two filters (of different sizes), given below:
F_1 = \begin{bmatrix} 0.9 & -0.2 \\ 0.4 & 0.3 \end{bmatrix}, \qquad F_2 = \begin{bmatrix} 0.1 & 0.6 & -0.1 \\ 0.1 & 0.3 & 0.7 \end{bmatrix}.
with the following classifier parameters:
W = \begin{bmatrix} 0.7 & 0.8 \\ 0.1 & 0.5 \end{bmatrix}, \qquad b = \begin{bmatrix} 0.2 & -0.3 \end{bmatrix}.
a. Compute the latent semantic vector z that results from applying the convolutions (followed by 1-max pooling) with all filters. z = [Answer, Answer]
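Since the similarity matrix x is not reproduced above, task (a) can only be sketched. The snippet below implements the valid 2-D convolution and 1-max pooling step with the two filters from the problem; the values of `x` are a hypothetical stand-in, not part of the original problem.

```python
# Valid 2-D convolution followed by 1-max pooling, as required in task (a).
# The similarity matrix x below is HYPOTHETICAL (the original is not shown);
# the filters F1 and F2 are the ones given in the problem statement.

def conv2d_valid(x, f):
    """All dot products of filter f with same-sized patches of x (no padding)."""
    rows = len(x) - len(f) + 1
    cols = len(x[0]) - len(f[0]) + 1
    return [
        sum(f[i][j] * x[r + i][c + j]
            for i in range(len(f)) for j in range(len(f[0])))
        for r in range(rows) for c in range(cols)
    ]

x = [[0.8, 0.1, 0.3],   # assumed similarities of tq1 to td1, td2, td3
     [0.2, 0.9, 0.4]]   # assumed similarities of tq2 to td1, td2, td3

F1 = [[0.9, -0.2], [0.4, 0.3]]          # 2x2 filter: slides over 2 positions
F2 = [[0.1, 0.6, -0.1], [0.1, 0.3, 0.7]]  # 2x3 filter: covers x exactly once

# 1-max pooling keeps the largest activation per filter, giving z in R^2.
z = [max(conv2d_valid(x, F1)), max(conv2d_valid(x, F2))]
print(z)
```

For this assumed `x`, the 2×2 filter produces two activations and the 2×3 filter one, and pooling yields z ≈ [1.05, 0.68]; with the real matrix the procedure is identical.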
b. Compute the final prediction vector \hat{y} = \mathrm{softmax}(z^\top W + b) for the query-document pair. Hint: in this subtask, round only the final result.
\hat{y} = [Answer, Answer]
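Task (b) applies the softmax classifier to the pooled vector. A minimal sketch, using the W and b from the problem but a placeholder z (the true z depends on the missing matrix x):

```python
import math

# Softmax classifier y_hat = softmax(z^T W + b) for task (b).
# z is a PLACEHOLDER (the real value depends on the missing matrix x);
# W and b are taken from the problem statement.
z = [1.05, 0.68]                # assumed pooled feature vector
W = [[0.7, 0.8], [0.1, 0.5]]
b = [0.2, -0.3]

# z^T W + b: one logit per class (relevant = index 0, non-relevant = index 1)
logits = [sum(z[i] * W[i][j] for i in range(2)) + b[j] for j in range(2)]

# Numerically stable softmax: subtract the max logit before exponentiating.
m = max(logits)
exps = [math.exp(s - m) for s in logits]
y_hat = [e / sum(exps) for e in exps]
print(y_hat)
```

With the placeholder z the logits come out as [1.003, 0.88] and ŷ ≈ [0.53, 0.47]; subtracting the maximum logit changes nothing mathematically but avoids overflow for large scores.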
c. Knowing that the current document is relevant, compute the cross-entropy loss of the model's prediction. Use the natural logarithm and round your answer to two decimal places.
cross-entropy loss: Answer
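For task (c), the true class is relevant (index 0), so the one-hot target is [1, 0] and the cross-entropy reduces to −ln ŷ₀. A sketch with an assumed prediction (the real ŷ depends on the missing matrix x):

```python
import math

# Cross-entropy loss for task (c): the document is relevant, i.e. the
# one-hot target is [1, 0], so the loss collapses to -ln(y_hat[0]).
y_hat = [0.5307, 0.4693]   # ASSUMED prediction; the real one needs matrix x

loss = -math.log(y_hat[0])  # natural logarithm, as the problem requires
print(round(loss, 2))
```

For this assumed prediction the loss rounds to 0.63; note that the term for the non-relevant class vanishes because its target probability is 0.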
Math Problem Analysis
Mathematical Concepts
Neural Networks
Convolutional Neural Networks
Softmax Function
Cross-Entropy Loss
Suitable Grade Level
Advanced