Time Travel: 2014

Chapter 341 A brand new world behind the door

Research related to text summarization actually has a long history.

Eve Carly had never been very clear about the state of text summarization research in the East.

But after coming to China, Eve Carly learned about it through some Chinese institutions that had friendly cooperation with MIT.

Although China seemed to have few projects in text summarization in the broad sense, when it came to Chinese text processing specifically, this ancient oriental country not only had dedicated projects; some were even covered by national programs such as the 863 Plan.

The 863 Plan, as the name suggests, was launched in March 1986.

It was the first time she had heard that many projects involving text summarization had started as early as the end of the last century.

Eve Carly was stunned.

And the more carefully she thought about it, the more astonishing it seemed.

It was already 2014, and a plan launched nearly thirty years ago was still moving forward, step by step.

Making a plan is not difficult; the difficulty lies in carrying it out.

It could be said that no other country in the world could have carried a plan laid down thirty years ago through to completion like this.

In short, Eve Carly felt this was almost unimaginable in the United States, where the two parties frequently alternate in power.

But this was, after all, only text summarization.

Eve Carly was not overly pessimistic.

After all, the West had also put a great deal of effort into text summarization, and had begun research in this area much earlier than China.

Eve Carly remembered hearing, back when she was still a student, that Western research on text summarization had begun in the early days of the Cold War.

The first to do this work were schools such as Stanford University and MIT.

But the sponsor standing behind these schools at the time was the Pentagon.

It sounds strange, but it's not surprising.

The fact is that today's Internet and all manner of computer technologies were originally inextricably linked to the military.

In fact, many technologies were almost purely military-to-civilian transfers.

That included the direction of text summarization.

Research on text summarization was carried out at the time in order to achieve technological breakthroughs that would allow information in public materials such as news and reports to be processed more efficiently, and also to better analyze the public opinion of hostile forces.

As for the hostile force, it was naturally the once extremely powerful polar bear.

Speaking of which, this also explained a strange feature of early text summarization programs.

They had essentially no ability to process Chinese, a language used by an enormous number of people.

Yet their processing of Russian was almost as efficient as their processing of English.

Whatever the original purpose, research involving text summarization had received considerable attention for a long time.

For a long stretch of history, part of the research funding in this field even came directly from the military budget of the United States.

Later, with the advent of more efficient means of gathering intelligence, such as spy satellites, the U.S. military's enthusiasm for research in this area gradually faded.

Despite this, commercial enthusiasm for text summarization had remained almost undiminished.

The importance of text as a carrier of information cannot be overstated.

With the rapid development of the Internet in the new century, a massive amount of information had emerged, and people had to pay ever more attention to how it was handled.

The deeper people delve into information, the more they learn about the world.

In-depth exploration of text summarization gives people greater command over information.

As for Lin Hui's contribution to text summarization, it was no exaggeration to say that he had changed the world.

At least, Eve Carly saw nothing wrong with that statement.

When it comes to specific fields, Lin Hui's contribution to natural language processing is equally great.

Compared with traditional extractive text summarization, generative text summarization has unprecedented significance.

Generative text summarization was of unprecedented significance not merely because the technology produced summaries more efficiently.

Of course, generative text summarization could indeed process text with higher efficiency.

This improvement in efficiency is indeed of great significance to relevant users such as journalists.

But that's not what researchers care about.

A wheel that spins faster might look more valuable than an otherwise identical wheel that spins more slowly.

But on closer inspection, that difference turns out to be of little real value.

In fact, Eve Carly felt that the improvement in efficiency was the least remarkable thing about generative text summarization.

It could even be said that efficiency was only the outward manifestation of the generative text summarization algorithm, not its real core.

The main content of natural language processing (NLP) in the usual sense is nothing more than two parts.

One part is NLU and the other part is NLG.

The former refers to natural language understanding, and the latter refers to natural language generation.

The generative text summarization algorithm developed by Lin Hui has extremely prominent significance in both natural language understanding and natural language generation.

Generative text summarization is a new kind of text summarization algorithm.

Compared with traditional extractive summarization, which can only pull sentences out of the original text, it can generate summaries directly, "out of nothing".

Such an algorithm has naturally achieved unprecedented heights in natural language understanding.

And this also inspires the possibility of achieving new breakthroughs in natural language generation.

Natural language generation is an extremely valuable direction.

The longer-term future of natural language generation involves more than just generating text from text.

Theoretically speaking, once neural network learning has progressed far enough, natural language generation could be performed even when the input is not text.

If this is true, then natural language processing will take off in a real sense in the future.

By then, natural language processing would completely break free of its current confinement to a narrow domain.

And how far would neural network learning have to develop in order to achieve such new breakthroughs?

Eve Carly had been deeply impressed by the deep learning Lin Hui mentioned in the supplementary content of his paper.

Everyone knows that neural network learning needs to go deeper in order to make the model more efficient.

But how to go deeper?

This is a problem.

Although many people in the world today already call neural network learning by the name "deep learning", Eve Carly felt that, in fact, none of it was truly "deep" enough.

The efficiency of the corresponding models fell far short.

She wondered whether Lin Hui could give yet another brand-new answer on deep learning.

If Lin Hui could indeed give a new and profound answer on deep learning,

then he would push that door completely open.

Behind the door would be a brand new world.

As for what world lay behind the door?

The era of artificial intelligence has completely arrived.
