Comprehension and generation are the two complementary aspects of natural language processing (NLP). Until recently, however, much of the research in NLP has focussed on comprehension. Some of the reasons for this almost exclusive emphasis on comprehension are (1) the belief that comprehension is harder than generation, (2) the fact that problems in comprehension could be formulated in the AI paradigm developed for problems in perception, and (3) the observation that the potential areas of application seemed to call for comprehension more than generation, e.g., question-answering systems, where the answers can be presented in some fixed format or even in some non-linguistic fashion (such as tables). Now there is a flurry of activity in generation, and a significant part of future NLP research will certainly be devoted to it. A key motivation for this interest in generation is the realization that many applications of NLP require responses that are flexible (i.e., not produced by filling in a fixed set of templates) and that often consist of a sequence of sentences (i.e., a text) with a genuine textual structure (not just an arbitrary sequence of sentences containing the necessary information). As research in generation takes root, a number of interesting theoretical issues have become important, and these are likely to determine the paradigm of research in this "new" area.