Embeddings: Test Your Knowledge

1. Which of the following features would be good candidates for an embedding? (Choose all that apply.)
   - Daily high temperatures for Tokyo, Japan from 1999–2024
   - Genre categorization of 10,000 plays: comedy, tragedy, or history
   - Scientific names for the 1.5+ million animal species that exist on Earth
   - Lines of code in a large software project

2. You are building a song-similarity embedding from a database of 10,000 songs, represented as one-hot encodings. How many dimensions will your embedding have?
   - 10,000 dimensions
   - Greater than 10,000 dimensions
   - Fewer than 10,000 dimensions

3. Which of the following are benefits of an embedding vector representation of feature data over a one-hot-encoded representation of the same data? (Choose all that apply.)
   - The embedding representation has fewer weights to tune during training.
   - The embedding representation can encode semantic relationships.
   - A model trained on the embedding representation does not need to be evaluated with a test set.
   - The embedding representation can be easier to display in a visualization.

4. True or false: a hidden layer of a trained neural network can be used as an embedding.
   - True
   - False

5. Which of the following statements about word2vec are true? (Choose all that apply.)
   - Word2vec is one of many techniques used to create vector representations of words.
   - Word2vec maps semantically similar words to embedding vectors that are close to each other in geometric space.
   - Word2vec maps a word with multiple meanings to multiple embedding vectors.
   - Word2vec produces contextual embeddings.
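The relationship between a one-hot encoding and an embedding (as in question 2) can be sketched in a few lines of NumPy. This is a minimal illustration, not part of the quiz: the song count matches the question, but the embedding width of 64 and the random embedding table are assumptions chosen purely for demonstration. Note that the embedding has far fewer dimensions than the one-hot input, and that multiplying a one-hot vector by the embedding table reduces to a simple row lookup.

```python
import numpy as np

NUM_SONGS = 10_000  # vocabulary size = one-hot dimension (from the quiz)
EMBED_DIM = 64      # assumed embedding size, chosen only for illustration

rng = np.random.default_rng(0)
# Embedding table: one EMBED_DIM-dimensional vector per song.
# (In practice these weights would be learned during training.)
embedding_table = rng.normal(size=(NUM_SONGS, EMBED_DIM))

# One-hot encoding of song #42: a sparse 10,000-dimensional vector.
one_hot = np.zeros(NUM_SONGS)
one_hot[42] = 1.0

# Multiplying the one-hot vector by the table is just a row lookup,
# yielding the song's dense embedding vector.
embedded = one_hot @ embedding_table
assert embedded.shape == (EMBED_DIM,)
assert np.allclose(embedded, embedding_table[42])
```

Because the lookup is equivalent to selecting one row, real embedding layers skip the matrix multiply entirely and index the table directly, which is why the embedding representation has far fewer weights to tune than a dense layer over the one-hot input.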