Quiz: Applications of LSTMs—Generating Text

Test how well you understand the applications of LSTMs.

1. How is perplexity related to entropy in the context of language modeling?

A) Perplexity is the same as entropy.

B) Perplexity is the inverse of entropy.

C) Perplexity is the square root of entropy.

D) Perplexity is 2 to the power of the entropy.
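For reference, here is a minimal sketch of how the two quantities relate, assuming a hypothetical toy next-word distribution and base-2 logarithms; it computes the entropy of a model's predictions and the corresponding perplexity.

```python
import math

# Hypothetical next-word probability distribution predicted by a language model.
probs = [0.5, 0.25, 0.125, 0.125]

# Entropy in bits: H = -sum(p * log2(p)).
entropy = -sum(p * math.log2(p) for p in probs)

# Perplexity is 2 raised to the entropy when entropy is measured in bits.
perplexity = 2 ** entropy

print(f"entropy    = {entropy:.3f} bits")  # 1.750 bits
print(f"perplexity = {perplexity:.3f}")    # 3.364
```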
