CICLing-2006
7th International Conference on Intelligent Text Processing and Computational Linguistics
February 19-25, 2006, Mexico City, Mexico
Endorsed by the ACL
www.CICLing.org/2006

PUBLICATION: LNCS: Springer Lecture Notes in Computer Science.

SUBMISSION DEADLINE:
  Abstract:  October 17 (late submissions can be considered);
  Main text: October 24, 2005 (for registered abstracts).

MODALITIES: Full paper: 12 pages; short paper: 4 pages.

KEYNOTE SPEAKERS: Nancy Ide, Rada Mihalcea; 2 more to be announced, see website.

EXCURSIONS: Ancient pyramids, Monarch butterflies, great cave and
colonial city, and more. All tentative. See photos on www.CICLing.org.

AWARDS: Best paper, best presentation, best poster, best demo.

+-------------------------------------------------------
| Topics
+-------------------------------------------------------

Computational linguistics research:
  Computational linguistics theories and formalisms,
  Knowledge representation,
  Computational morphology, syntax, semantics,
  Discourse models,
  Machine translation, text generation,
  Statistical methods, corpus linguistics,
  Lexical resources;

Intelligent text processing and applications:
  Information retrieval, question answering,
  Information extraction,
  Text mining,
  Document categorization and clustering,
  Automatic summarization,
  Natural language interfaces,
  Spell-checking;

and all related topics.

+-------------------------------------------------------
| Registration fee
+-------------------------------------------------------

            Author & public early / public on site
  Full:     US$ 320 / 370
  Student:  US$ 250 / 300

Discounts can be provided as an exception; see website.

+-------------------------------------------------------
| Schedule (tentative)
+-------------------------------------------------------

  Sunday, Wednesday, Saturday: full-day excursions;
  Monday, Tuesday, Thursday, Friday: talks;
  Monday: Welcome party & poster session.

See website.
==================================================== See complete CFP and contact on www.CICLing.org/2006 ====================================================