CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
GSUA 2016
De Montfort University, Leicester, UK
Multi-granularity Bidirectional Cognitive Computing
for Uncertain Data Processing
面向不确定性数据处理的多粒度双向认知计算
Guoyin Wang (王国胤)
Chongqing Key Lab. of Computational Intelligence,
Chongqing University of Posts and Telecommunications, China
Inst. of Electronic Information Tech.,
Chongqing Inst. of Green & Intelligent Tech., CAS, China
[email protected]
HTTP://CS.CQUPT.EDU.CN/WANGGY
1
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Outline
01  Big Data
02  Cognition and Cognitive Computing
03  Artificial Intelligence with Uncertainty
04  Bidirectional Cognitive Computing (BCC)
05  Conclusions and Prospects
2
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
01  Big Data
Nature: Special Issue on "Big Data", 2008
Science: Special Issue on "Dealing with Data", 2011
"Data is a new class of economic asset, like currency and gold."
Source: World Economic Forum 2012
3
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
1 KB = 1024 B = 2^10 B
1 MB = 2^10 KB = 1024 KB = 2^20 B
1 GB = 2^10 MB = 1024 MB = 2^30 B
1 TB = 2^10 GB = 1024 GB = 2^40 B
1 PB = 2^10 TB = 1024 TB = 2^50 B
1 EB = 2^10 PB = 1024 PB = 2^60 B
Vasant Dhar, Data science and prediction, Communications of the ACM, 2013, 56(12):64-73.
4
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Spaces and Sciences
• Physical Space → Natural Science
• Social Space → Social Science
• Data Space → Data Science?
5
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
02  Cognition and Cognitive Computing
In science, cognition refers to mental processes. These
processes include:
• Attention
• Memory
• Producing and understanding language
• Solving problems
• Making decisions.
S. Coren. Sensation & Perception. Harcourt College Publishers, 1999.
6
The Cognitive Science Hexagon:
• Philosophy of mind: the nature of mind, mental events, mental functions, mental properties, and their relationship to consciousness and to the physical body, particularly the brain (the mind-body problem).
  J. Kim. Problems in the Philosophy of Mind. Oxford: Oxford University Press, 1995.
• Cognitive psychology: mental processes, i.e. how people think, perceive, remember, and learn.
  G.J. Feist, E.L. Rosenberg. Psychology: Making Connections. McGraw-Hill Humanities/Social Sciences/Languages, 2009.
• Artificial intelligence: the science of making intelligent machines, including robotics, speech recognition, image recognition, natural language processing, expert systems, etc.
  S.J. Russell, P. Norvig. Artificial Intelligence: A Modern Approach. Prentice Hall, 2009.
• Cognitive linguistics: cognitive semantics, cognitive grammar, cognitive phonology.
  W. Croft, D. Alan Cruse. Cognitive Linguistics. Cambridge: Cambridge University Press, 2004.
• Cognitive anthropology: concerned with what people from different groups know and how that implicit knowledge changes the way people perceive the world around them.
  R. D'Andrade. The Development of Cognitive Anthropology. Cambridge: Cambridge University Press, 1995.
• Cognitive neuroscience: the biological substrates underlying cognition, with a specific focus on the neural substrates of mental processes, i.e. how psychological and cognitive functions are produced by the brain.
  M.S. Gazzaniga. The Cognitive Neurosciences III. The MIT Press, 2004.
  G.A. Miller. The cognitive revolution: a historical perspective. TRENDS in Cognitive Sciences, Vol. 7, No. 3, pp. 141-144, 2003.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
03  Artificial Intelligence with Uncertainty
"A new research field in artificial intelligence in the 21st century: Artificial Intelligence with Uncertainty."
A fundamental problem: the expression and processing of uncertain concepts.
D.Y. Li, Y. Du. Artificial Intelligence with Uncertainty (1st ed). London: Chapman and Hall/CRC, 2007.
8
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Key features of uncertainty:
• Randomness
• Fuzziness
• Incompleteness
• Instability
• Inconsistency
• ...
9
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Unidirectional Computational Cognition
Common characteristic: a unidirectional transformation from data to knowledge.
• Knowledge Discovery: the creation of knowledge from structured (relational databases, XML) and unstructured (text, documents, images) sources.
• Machine Learning: a scientific discipline concerned with the design and development of algorithms that allow computers to evolve behaviors based on empirical data, such as sensor data or databases.
• Data Mining: the analysis step of Knowledge Discovery in Databases; it focuses on the discovery of (previously) unknown properties from data.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Cognitive Transformation (human brain ↔ computer)
• Intension of a concept ↔ extension of a concept.
• Forward Cloud Transformation (FCT): from intension to extension.
• Backward Cloud Transformation (BCT): from extension to intension.
[Figure: a table of people (ID, name, birth date, ...) on one side and the cloud model of the concept "young people", C(25, 3, 0.3), on the other; certainty degree μ versus cloud drops x (age).]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
04  Bidirectional Cognitive Computing (BCC)
 Cloud Model
 Experiments on Bidirectional Cognition Processes
12
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Numerical Characters of the Cloud Model
Three numerical characters of a cloud represent a concept as a whole:
• Expected value Ex: the expectation of the cloud drops in the universe.
• Entropy En: the uncertainty measurement of the qualitative concept.
• Hyper entropy He: the uncertainty measurement of En; it measures whether the concept can be formed.
[Figure: cloud model with Ex = 25, En = 3, He = 0.3; certainty degree μ(x) versus x, with Ex and the width 3En marked.]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Cloud Transformation
• Forward Cloud Transformation (FCT)
Forward cloud transformation is a mapping from quality (intension) to quantity (extension). It generates cloud drops from C(Ex, En, He):
y_i = RN(En, He), x_i = RN(Ex, y_i), i = 1, 2, ..., n.
This is the process of the 2nd-order forward cloud transformation (2nd FCT).
[Figure: FCT of the concept "young people", C(25, 3, 0.3), with n = 1000 cloud drops; certainty degree μ versus cloud drops x (age of young people).]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
► Algorithm 2nd order FCT
Input: (Ex, En, He), and the number of cloud drops n;
Output: n cloud drops x_i and their certainty degrees μ_i, i.e. Drop(x_i, μ_i), i = 1, 2, ..., n;
Steps:
(1) Generate a normally distributed random number En'_i with expectation En and variance He^2, i.e. En'_i = NORM(En, He^2);
(2) Generate a normally distributed random number x_i with expectation Ex and variance (En'_i)^2, i.e. x_i = NORM(Ex, (En'_i)^2);
(3) Calculate $\mu_i = e^{-\frac{(x_i - Ex)^2}{2(En'_i)^2}}$;
(4) x_i with certainty degree μ_i is a cloud drop in the domain;
(5) Repeat steps (1) to (4) until n cloud drops are generated.
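To make the procedure concrete, here is a minimal NumPy sketch of the 2nd-order FCT described above. The function name and the use of NumPy are illustrative choices, not part of the original material.

```python
import numpy as np

def forward_cloud_transform(ex, en, he, n, rng=None):
    """2nd-order FCT sketch: generate n cloud drops x_i and their
    certainty degrees mu_i for the concept C(Ex, En, He)."""
    rng = np.random.default_rng() if rng is None else rng
    # Step (1): En'_i = NORM(En, He^2)
    en_i = rng.normal(en, he, size=n)
    # Step (2): x_i = NORM(Ex, (En'_i)^2); abs() guards against a negative
    # standard deviation when a sampled En'_i happens to be negative
    x = rng.normal(ex, np.abs(en_i))
    # Step (3): mu_i = exp(-(x_i - Ex)^2 / (2 (En'_i)^2))
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_i ** 2))
    return x, mu

# Example: the concept "young people", C(25, 3, 0.3), with n = 1000 drops
drops, certainty = forward_cloud_transform(25.0, 3.0, 0.3, 1000)
```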
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Backward Cloud Transformation (BCT)
Backward cloud transformation is the model for transforming from quantitative values (extension) to a qualitative concept (intension). It maps a quantity of precise data to a qualitative concept expressed by (Ex, En, He). There are five kinds of backward cloud algorithms, as follows.
[Figure: BCT recovers the concept "young people", C(25, 3, 0.3), from cloud drops x (age of young people); certainty degree μ versus x.]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• The five kinds of backward cloud transformation algorithms:
One-step, 2nd order (Ex, En, He):
► SBCT-1stM (C.Y. Liu, D.Y. Li)
► SBCT-4thM (L.X. Wang)
Multi-step, 2nd order (Ex, En, He):
► MBCT-SD (G.Y. Wang, C.L. Xu)
► MBCT-SR (G.Y. Wang, C.L. Xu)
High order (Ex, En_1, ..., En_{p-1}, He):
► pth-BCT (p > 2) (G.Y. Wang, C.L. Xu)
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Algorithm SBCT-1stM
SBCT-1stM is based on the sample variance and the first-order sample absolute central moment to estimate the entropy En and hyper entropy He.
From the sample data (x_1, x_2, ..., x_n):
① mean: $\hat{Ex} = \bar{X} = \frac{1}{n}\sum_{i=1}^{n} x_i$;
② variance: $S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{X})^2$; first-order sample absolute central moment: $E|X-\bar{X}| = \frac{1}{n}\sum_{i=1}^{n}|x_i - \bar{X}|$;
③ relations: $S^2 = En^2 + He^2$, $E|X-\bar{X}| = \sqrt{\frac{2}{\pi}}\,En$;
④ estimates: $\hat{En} = \sqrt{\frac{\pi}{2}}\cdot\frac{1}{n}\sum_{i=1}^{n}|x_i - \hat{Ex}|$, $\hat{He} = \sqrt{S^2 - \hat{En}^2}$.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
► Algorithm SBCT-1stM
Input: Samples x_1, x_2, ..., x_n.
Output: The estimates of a qualitative concept (Ex, En, He).
Steps:
(1) Calculate the mean, variance and first-order absolute central moment of the samples x_i, respectively, i.e.
$\bar{X} = \frac{1}{n}\sum_{i=1}^{n} x_i$, $S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{X})^2$, $E|X-\bar{X}| = \frac{1}{n}\sum_{i=1}^{n}|x_i-\bar{X}|$.
(2) According to the equations $S^2 = En^2 + He^2$ and $E|X-\bar{X}| = \sqrt{\frac{2}{\pi}}\,En$, calculate the estimates of Ex, En and He respectively, i.e.
$\hat{Ex} = \bar{X}$, $\hat{En} = \sqrt{\frac{\pi}{2}}\cdot\frac{1}{n}\sum_{i=1}^{n}|x_i - \hat{Ex}|$, $\hat{He} = \sqrt{S^2 - \hat{En}^2}$.
C.Y. Liu, M. Feng, X.J. Dai, D.Y. Li, "A new algorithm of backward cloud," Journal of System Simulation, vol. 16, no. 11, pp. 2417-2420, 2004.
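A minimal Python sketch of SBCT-1stM following the two steps above; the clamping of negative values is a guard added here, since the slides note the estimate of He may fail when S^2 < En^2.

```python
import numpy as np

def sbct_1stm(samples):
    """SBCT-1stM sketch: estimate (Ex, En, He) from cloud drops via the
    sample variance and the first-order absolute central moment."""
    x = np.asarray(samples, dtype=float)
    ex = x.mean()                              # step (1): sample mean
    s2 = x.var(ddof=1)                         # sample variance S^2
    abs_moment = np.abs(x - ex).mean()         # E|X - X_bar|
    en = np.sqrt(np.pi / 2.0) * abs_moment     # step (2): En estimate
    # He^2 = S^2 - En^2; clamp at 0 (the original method may simply fail here)
    he = np.sqrt(max(s2 - en ** 2, 0.0))
    return ex, en, he

# Example with synthetic drops of the concept "young people", C(25, 3, 0.3)
rng = np.random.default_rng(0)
drops = rng.normal(25.0, np.abs(rng.normal(3.0, 0.3, 1000)))
print(sbct_1stm(drops))
```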
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Characteristics of SBCT-1stM:
SBCT-1stM is based on the sample variance and the first-order sample absolute central moment to estimate the entropy En and hyper entropy He.
[Figure: from an n = 1000 sample of "young people", SBCT-1stM estimates (25.02, 2.96, 0.35); certainty degree μ versus cloud drops x (age of young people).]
• Shortcomings of SBCT-1stM
• When He is very small, SBCT-1stM may fail to obtain the estimates of He and En.
• When He/En > 1/3, SBCT-1stM will have a large error in the estimates of He and En.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Algorithm SBCT-4thM
SBCT-4thM is based on the sample variance and the fourth-order sample central moment to estimate the entropy En and hyper entropy He.
From the sample data (x_1, x_2, ..., x_n):
① mean: $\hat{Ex} = \bar{X}$;
② variance: $S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{X})^2$; fourth-order sample central moment: $\mu_4 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{X})^4$;
③ relations: $S^2 = En^2 + He^2$, $\mu_4 = 9He^4 + 18He^2 En^2 + 3En^4$;
④ estimates: $\hat{En} = \sqrt[4]{\frac{9(S^2)^2 - \mu_4}{6}}$, $\hat{He} = \sqrt{S^2 - \hat{En}^2}$.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
► Algorithm SBCT-4thM
Input: Samples x_1, x_2, ..., x_n.
Output: The estimates of a qualitative concept (Ex, En, He).
Steps:
(1) Calculate the mean, variance and fourth-order central moment of the samples x_i, respectively, i.e.
$\bar{X} = \frac{1}{n}\sum_{i=1}^{n} x_i$, $S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{X})^2$, $\mu_4 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{X})^4$.
(2) According to the equations $S^2 = En^2 + He^2$ and $\mu_4 = 3(3He^4 + 6He^2 En^2 + En^4)$, calculate the estimates of Ex, En and He respectively, i.e.
$\hat{Ex} = \bar{X}$, $\hat{En} = \sqrt[4]{\frac{9(S^2)^2 - \mu_4}{6}}$, $\hat{He} = \sqrt{S^2 - \hat{En}^2}$.
L.X. Wang. The basic mathematical properties of normal cloud and cloud filter, Personal Communication, May 2011.
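A corresponding Python sketch of SBCT-4thM; it follows the moment equations above and simply raises an error in the failure cases mentioned on the next slide (the function name and the error handling are illustrative).

```python
import numpy as np

def sbct_4thm(samples):
    """SBCT-4thM sketch: estimate (Ex, En, He) from cloud drops via the
    sample variance and the fourth-order sample central moment."""
    x = np.asarray(samples, dtype=float)
    n = x.size
    ex = x.mean()
    s2 = ((x - ex) ** 2).sum() / (n - 1)       # sample variance S^2
    mu4 = ((x - ex) ** 4).sum() / (n - 1)      # 4th-order central moment
    radicand = (9.0 * s2 ** 2 - mu4) / 6.0     # estimate of En^4
    if radicand < 0:
        raise ValueError("SBCT-4thM failed: 9(S^2)^2 - mu_4 < 0")
    en = radicand ** 0.25
    if s2 < en ** 2:
        raise ValueError("SBCT-4thM failed: S^2 - En^2 < 0")
    return ex, en, float(np.sqrt(s2 - en ** 2))

# Example with synthetic drops of the concept "young people", C(25, 3, 0.3)
rng = np.random.default_rng(0)
drops = rng.normal(25.0, np.abs(rng.normal(3.0, 0.3, 1000)))
print(sbct_4thm(drops))
```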
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Characteristics of SBCT-4thM:
SBCT-4thM is based on the sample variance and the fourth-order sample central moment to estimate the entropy En and hyper entropy He.
[Figure: from an n = 1000 sample of "young people", SBCT-4thM estimates (25.02, 2.97, 0.34); certainty degree μ versus cloud drops x (age of young people).]
• Shortcoming of SBCT-4thM
In Step 2, SBCT-4thM may fail to estimate the entropy En and hyper entropy He when $\frac{9(S^2)^2 - \mu_4}{6} < 0$ or $S^2 - \hat{En}^2 < 0$, for example when n is a small number.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
► Algorithm MBCT-SD
Input: Samples x_1, x_2, ..., x_n, group number m, each group of sample size r.
Output: The estimates of a qualitative concept (Ex, En, He).
Steps:
(1) Calculate the sample mean of the samples x_i, i.e. $\hat{Ex} = \frac{1}{n}\sum_{i=1}^{n} x_i$.
(2) Obtain a new sample from the samples x_i, that is, randomly draw m groups from x_i, each group containing r samples, with n = m·r (n, m, r are positive integers). Calculate each group's sample variance, i.e.
$\hat{y}_i^2 = \frac{1}{r-1}\sum_{j=1}^{r}(x_{ij} - \hat{Ex}_i)^2$, where $\hat{Ex}_i = \frac{1}{r}\sum_{j=1}^{r} x_{ij}$ (i = 1, 2, ..., m).
(3) Calculate the estimates of En^2 and He^2 from the new sample $y_1^2, y_2^2, \ldots, y_m^2$ respectively, i.e.
$\hat{En}^2 = \frac{1}{2}\sqrt{4(\hat{EY^2})^2 - 2\hat{DY^2}}$, $\hat{He}^2 = \hat{EY^2} - \hat{En}^2$,
where $\hat{EY^2} = \frac{1}{m}\sum_{i=1}^{m}\hat{y}_i^2$, $\hat{DY^2} = \frac{1}{m-1}\sum_{i=1}^{m}(\hat{y}_i^2 - \hat{EY^2})^2$.
G.Y. Wang, C.L. Xu, Q.H. Zhang, X.R. Wang. A Multi-step Backward Cloud Generator Algorithm, RSCTC 2012, LNAI 7413, pp. 313-322, 2012.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Algorithm MBCT-SR
C(Ex, En, He) → FCT (y_i = RN(En, He), x_i = RN(Ex, y_i)) → cloud drops x_1, x_2, ..., x_n.
From the drops: compute $\hat{Ex} = \frac{1}{n}\sum_{i=1}^{n} x_i$; sample with replacement to form groups x_11, x_12, ..., x_1r; x_21, x_22, ..., x_2r; ...; x_m1, x_m2, ..., x_mr; compute the group variances y_1^2, y_2^2, ..., y_m^2; and from them estimate $\hat{En}^2$ and $\hat{He}^2$.
• MBCT-SR is also a two-step method to estimate the entropy En and hyper entropy He.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
► Algorithm MBCT-SR
Input: Samples x_1, x_2, ..., x_n, group number m, each group of sample size r.
Output: The estimates of a qualitative concept (Ex, En, He).
Steps:
(1) Calculate the sample mean of the samples x_i, i.e. $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} x_i$.
(2) Obtain a new sample from the samples x_i, that is, randomly draw m groups from x_i with replacement, each group containing r samples (n, m, r are positive integers). Calculate each group's sample variance, i.e.
$\hat{y}_i^2 = \frac{1}{r-1}\sum_{j=1}^{r}(x_{ij} - \hat{Ex}_i)^2$, where $\hat{Ex}_i = \frac{1}{r}\sum_{j=1}^{r} x_{ij}$ (i = 1, 2, ..., m).
(3) Calculate the estimates of En^2 and He^2 from the new sample $y_1^2, y_2^2, \ldots, y_m^2$ respectively, i.e.
$\hat{En}^2 = \frac{1}{2}\sqrt{4(\hat{EY^2})^2 - 2\hat{DY^2}}$, $\hat{He}^2 = \hat{EY^2} - \hat{En}^2$,
where $\hat{EY^2} = \frac{1}{m}\sum_{i=1}^{m}\hat{y}_i^2$, $\hat{DY^2} = \frac{1}{m-1}\sum_{i=1}^{m}(\hat{y}_i^2 - \hat{EY^2})^2$.
C.L. Xu, G.Y. Wang. Backward Cloud Transformation Algorithm for Realizing Stability Bidirectional Cognitive Mapping, PR&AI, 2013, 26(7):634-642.
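A Python sketch of MBCT-SR; MBCT-SD can be obtained from the same code by partitioning the n = m·r samples into m disjoint groups in step (2) instead of resampling with replacement. The names, the guards against negative values, and the example group sizes are illustrative.

```python
import numpy as np

def mbct_sr(samples, m, r, rng=None):
    """MBCT-SR sketch: estimate (Ex, En, He) from m groups of r samples
    drawn with replacement from the cloud drops."""
    x = np.asarray(samples, dtype=float)
    rng = np.random.default_rng() if rng is None else rng
    ex = x.mean()                                   # step (1): sample mean
    # Step (2): group sample variances y_1^2, ..., y_m^2
    groups = rng.choice(x, size=(m, r), replace=True)
    y2 = groups.var(axis=1, ddof=1)
    # Step (3): En^2 = (1/2) sqrt(4 (EY^2)^2 - 2 DY^2), He^2 = EY^2 - En^2
    ey2, dy2 = y2.mean(), y2.var(ddof=1)
    en2 = 0.5 * np.sqrt(max(4.0 * ey2 ** 2 - 2.0 * dy2, 0.0))
    he2 = max(ey2 - en2, 0.0)
    return ex, float(np.sqrt(en2)), float(np.sqrt(he2))

# Example: recover the uncertain concept C(25, 3, 0.55) from 1000 drops
rng = np.random.default_rng(1)
drops = rng.normal(25.0, np.abs(rng.normal(3.0, 0.55, 1000)))
print(mbct_sr(drops, m=100, r=50, rng=rng))
```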
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
► Algorithm pth order BCT (p > 2): the pth-order Backward Cloud Transformation Algorithm
Input: Samples x_1, x_2, ..., x_n, group number m, each group of sample size r.
Output: The estimates of (Ex = En_1, En_2, En_3, ..., En_{p-1}, En_p, He).
Steps:
(1) Calculate the sample mean of the samples x_i, i.e. $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} x_i$.
(2) Obtain a new sample from the samples x_i, that is, randomly draw m groups from x_i with replacement, each group containing r samples (n, m, r are positive integers). Calculate each group's sample variance, i.e.
$\hat{y}_i^2 = \frac{1}{r-1}\sum_{j=1}^{r}(x_{ij} - \hat{Ex}_i)^2$, where $\hat{Ex}_i = \frac{1}{r}\sum_{j=1}^{r} x_{ij}$ (i = 1, 2, ..., m).
Let $Y = \{y_1^2, y_2^2, \ldots, y_m^2\}$.
Guoyin Wang, Changlin Xu, Qinghua Zhang, Xiaorong Wang. p-order Normal Cloud Model Recursive Definition and Analysis of Bidirectional Cognitive Computing, Chinese Journal of Computers, 2013, 36(11):2316-2329.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
(3) Calculate the estimates of $En_2^2, \ldots, En_{p-1}^2, En_p^2, He^2$ from the new sample respectively, i.e.
for (p = 2; p < P; p++)
{
  $\hat{En}_p^2 = \frac{1}{2}\sqrt{4(\hat{EY^2})^2 - 2\hat{DY^2}}$,
  where $\hat{EY^2} = \frac{1}{m}\sum_{i=1}^{m}\hat{y}_i^2$, $\hat{DY^2} = \frac{1}{m-1}\sum_{i=1}^{m}(\hat{y}_i^2 - \hat{EY^2})^2$;
  Input a group of new values mm, rr, and let m = mm, r = rr;
  $\hat{En}_{p+1,i}^2 = \frac{1}{2}\sqrt{4(\hat{EY}_{p+1,i}^2)^2 - 2\hat{DY}_{p+1,i}^2}$,
  $\hat{u}_{p+1,i}^2 = \hat{EY}_{p+1,i}^2 - \hat{En}_{p+1,i}^2$ (i = 1, 2, ..., m),
  where $\hat{EY}_{p+1,i}^2 = \frac{1}{r}\sum_{j=1}^{r}\hat{y}_j^2$, $\hat{DY}_{p+1,i}^2 = \frac{1}{r-1}\sum_{j=1}^{r}(\hat{y}_j^2 - \hat{EY}_{p+1,i}^2)^2$;
  Let $Y = \{\hat{u}_{p+1,1}^2, \hat{u}_{p+1,2}^2, \ldots, \hat{u}_{p+1,m}^2\}$;
}
Estimate $En_p^2$ and $He_p^2$ from the sample set Y, i.e.
$\hat{En}_p^2 = \frac{1}{2}\sqrt{4(\hat{EY^2})^2 - 2\hat{DY^2}}$, $\hat{He}_p^2 = \hat{EY^2} - \hat{En}_p^2$,
where $\hat{EY^2} = \frac{1}{m}\sum_{i=1}^{m}\hat{y}_i^2$, $\hat{DY^2} = \frac{1}{m-1}\sum_{i=1}^{m}(\hat{y}_i^2 - \hat{EY^2})^2$.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Generic Normal Cloud Transformation
– 2nd-order Generic Forward Cloud Transformation (2nd order G-FCT)
– 2nd-order Generic Backward Cloud Transformation (2nd order G-BCT)
– pth-order Generic Forward and Backward Cloud Transformation (pth order G-FCT, pth order G-BCT)
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• 2nd order G-FCT
The 2nd order FCT: concept C(Ex, En, He) → y_i = RN(En, He), x_i = RN(Ex, y_i) (i = 1, 2, ..., n) → cloud drops x_1, x_2, ..., x_n.
The 2nd order G-FCT: concept C(Ex, En, He) → y_i = RN(En, He) (i = 1, 2, ..., m), x_ij = RN(Ex, y_i) (j = 1, 2, ..., r_i) → cloud drops x_11, x_12, ..., x_1r_1; x_21, x_22, ..., x_2r_2; ...; x_m1, x_m2, ..., x_mr_m.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
► Algorithm 2nd order G-FCT
Input: (Ex, En, He), the number of cloud drops n, and the parameters m, r_i (i = 1, 2, ..., m);
Output: n cloud drops x_ij with certainty degrees μ(x_ij) (i = 1, 2, ..., m; j = 1, 2, ..., r_i);
Steps:
(1) Generate m normally distributed random numbers y_i with expectation En and variance He^2, i.e. y_i = NORM(En, He^2);
(2) For each y_i in step (1), generate r_i normally distributed random numbers x_ij with expectation Ex and variance y_i^2, i.e. x_ij = NORM(Ex, y_i^2);
(3) Calculate $\mu(x_{ij}) = e^{-\frac{(x_{ij} - Ex)^2}{2y_i^2}}$;
(4) x_ij with certainty degree μ(x_ij) is a 2nd-order generic normal cloud drop in the domain;
(5) Repeat steps (1) to (4) until n cloud drops are generated.
G.Y. Wang, C.L. Xu, D.Y. Li. Generic Normal Cloud Model, Information Sciences, 2014.
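A minimal sketch of the 2nd order G-FCT; each group of r_i drops shares one sampled y_i. The function name and the list-based grouping are illustrative choices.

```python
import numpy as np

def generic_fct(ex, en, he, group_sizes, rng=None):
    """2nd-order generic FCT sketch: one y_i = NORM(En, He^2) per group,
    then r_i drops x_ij = NORM(Ex, y_i^2) with their certainty degrees."""
    rng = np.random.default_rng() if rng is None else rng
    drops, certainty = [], []
    for r_i in group_sizes:                          # groups of sizes r_1..r_m
        y_i = rng.normal(en, he)                     # step (1)
        x_ij = rng.normal(ex, abs(y_i), size=r_i)    # step (2)
        mu_ij = np.exp(-(x_ij - ex) ** 2 / (2.0 * y_i ** 2))  # step (3)
        drops.append(x_ij)
        certainty.append(mu_ij)
    return np.concatenate(drops), np.concatenate(certainty)

# One of the settings used on the next slide: (25, 3.0, 0.55), m = 50, r = 100
x, mu = generic_fct(25.0, 3.0, 0.55, [100] * 50)
```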
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Let (Ex, En, He) = (25, 3.0, 0.55), r_1 = r_2 = ... = r_m = r, n = 5000.
When m and r take different values (with m·r = n), the shapes of the 2nd order G-FCT results are as follows.
[Figure: certainty degree μ versus x for eight settings: (a) m = 5000, r = 1 (the 2nd order FCT); (b) m = 500, r = 10; (c) m = 100, r = 50; (d) m = 50, r = 100; (e) m = 10, r = 500; (f) m = 5, r = 1000; (g) m = 2, r = 2500; (h) m = 1, r = 5000 (a normal distribution).]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Conclusions
① When m = n and r_1 = r_2 = ... = r_m = 1, only one cloud drop x_ij is obtained for each y_i, that is, each cloud drop corresponds to one y_i; the n cloud drops then come from n different normal distributions, so the 2nd order G-FCT reduces to the 2nd order FCT proposed in reference [1].
② When m = 1 and r_1 = n, there is only one y_1 = RN(En, He), and the 2nd order G-FCT reduces to a normal distribution N(Ex, y_1).
[1] D.Y. Li, Y. Du. Artificial Intelligence with Uncertainty (1st ed). London: Chapman and Hall/CRC, 2007.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
 Experiments on Bidirectional Cognition Processes
(1) Cognizing a concept over and over again
(2) Cognizing process with the increasing of sample size n
(3) Many people’s mutual cognizing process for a concept
(4) Multi granularity concept cognition
(5) Image segmentation Based on Bidirectional
Computational Cognition
34
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
(1) Cognizing a concept over and over again
C(Ex, En, He) → FCT → cloud drops → BCT_i → C(Ex_i, En_i, He_i), repeated for L cycles.
Simulate the cognition processes of SBCT-1stM, SBCT-4thM, MBCT-SD, MBCT-SR and 3rd-BCT respectively, with different cycle numbers L.
[Figure: certainty degree μ versus x for a concept passed back and forth between FCT and BCT_i over L cycles.]
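A sketch of this iterated cognition loop, here using SBCT-1stM as the backward step; the drop count per cycle (1000) and the random seed are illustrative choices, so the numbers it prints will differ from those reported on the following slides.

```python
import numpy as np

def fct(ex, en, he, n, rng):
    """Forward step: generate n cloud drops of C(Ex, En, He)."""
    return rng.normal(ex, np.abs(rng.normal(en, he, size=n)))

def sbct_1stm(x):
    """Backward step (SBCT-1stM): estimate (Ex, En, He) from drops x."""
    ex = x.mean()
    en = np.sqrt(np.pi / 2.0) * np.abs(x - ex).mean()
    he = np.sqrt(max(x.var(ddof=1) - en ** 2, 0.0))
    return ex, en, he

# Cognize the clear concept C(25, 3, 0.1) over and over again, L = 50 cycles
rng = np.random.default_rng(2)
ex, en, he = 25.0, 3.0, 0.1
for _ in range(50):
    drops = fct(ex, en, he, 1000, rng)    # intension -> extension
    ex, en, he = sbct_1stm(drops)         # extension -> intension
print(round(ex, 2), round(en, 2), round(he, 2))
```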
• When the number of cycles L = 1 (clear concept C(25, 3, 0.1)):
SBCT-1stM: C1(24.96, 2.98, 0.28)
SBCT-4thM: C2(25.02, 2.95, 0.21)
MBCT-SD: C3(24.98, 3.01, 0.09)
MBCT-SR: C4(25.03, 2.97, 0.09)
3rd-BCT: C5(25.02, 2.98, 0.32, 0.02)
Conclusion: when the qualitative concept is quite clear, the cognition results of SBCT-1stM, SBCT-4thM and 3rd-BCT show some excursion, while the cognition results of MBCT-SD and MBCT-SR are very good.
[Figures: certainty degree μ versus cloud drops x (ages of young people) for each method.]
• When the number of cycles L = 50 (clear concept C(25, 3, 0.1)):
SBCT-1stM: C1(24.92, 2.96, 0.34)
SBCT-4thM: C2(25.03, 2.98, 0.30)
MBCT-SD: C3(24.98, 3.01, 0.09)
MBCT-SR: C4(24.97, 2.85, 0.091)
3rd-BCT: C5(24.93, 2.97, 0.29, 0.009)
Conclusion: when the qualitative concept is quite clear, the cognition results of SBCT-1stM, SBCT-4thM and 3rd-BCT show some excursion, while the cognition results of MBCT-SD and MBCT-SR are very good.
[Figures: certainty degree μ versus cloud drops x (ages of young people) for each method.]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Conclusions
▬ MBCT-SD and MBCT-SR outperform SBCT-1stM, SBCT-4thM and 3rd-BCT for cognizing uncertain concepts in all cases.
▬ Together with FCT, MBCT-SD and MBCT-SR could be used to construct a stable bidirectional cognitive mapping between a concept's intension and extension.
▬ All five backward cloud transformations could be used to cognize a concept from its extension to its intension; they can be seen as simulating the cognition of different kinds of people.
38
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
 Experiments on Bidirectional Cognition Processes
(1) Cognizing a concept over and over again
(2) Cognizing process with the increasing of sample size n
(3) Many people’s mutual cognition process for a concept
(4) Multi granularity concept cognition
(5) Image segmentation Based on Bidirectional
Computational Cognition
39
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
When the sample size n is increasing constantly, what concept C(Ex, En, He) will be cognized?
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Three test concepts:
• Clear concept: C(25, 3, 0.1)
• Uncertain concept: C(25, 3, 0.55)
• Confusing concept: C(25, 1, 0.8)
[Figures: certainty degree μ versus cloud drops x (ages of young people) for each concept.]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• The cognizing process of MBCT-SR when the sample size n is increasing (uncertain concept C(25, 3, 0.55)):
n = 1: C4(20, 0, 0)
n = 2: C4(25.87, 1.92, 1.21)
n = 10: C4(24.37, 2.14, 0.80)
n = 50: C4(24.74, 3.15, 0.78)
n = 400: C4(24.91, 3.03, 0.53)
n = 1000: C4(25.02, 3.01, 0.54)
[Figures: certainty degree μ versus cloud drops x for each sample size.]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Conclusions
▬ The cognition processes of the five backward cloud transformations show some differences as the sample size n increases.
• For a clear concept, MBCT-SD and MBCT-SR give good cognition results, while the results of SBCT-1stM, SBCT-4thM and 3rd-BCT show some excursion.
• For a confusing concept, SBCT-1stM, SBCT-4thM, MBCT-SD and MBCT-SR give good cognition results, while the result of 3rd-BCT shows much excursion.
▬ SBCT-1stM, SBCT-4thM and MBCT-SD may fail to cognize a concept when there are only a few samples (a very small n).
43
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
 Experiments on Bidirectional Cognition Processes
(1) Cognizing a concept over and over again
(2) Cognizing process with the increasing of sample size n
(3) Many people’s mutual cognition process for a concept
(4) Multi granularity concept cognition
(5) Image segmentation Based on Bidirectional
Computational Cognition
44
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
(3) Many people’s mutual cognition process for a concept
Simulate the cognizing process when a qualitative concept is
passed from one person to another over and over again.
.....
45
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Experiment method:
C(Ex, En, He) → FCT ↔ BCT_i → C_i(Ex_i, En_i, He_i) → FCT ↔ BCT_j → C_j(Ex_j, En_j, He_j) → ... → FCT ↔ BCT_k → C_k(Ex_k, En_k, He_k)
The uncertain concept C(25, 3, 0.55) passed among different BCTs:
SBCT-1stM: C1(24.98, 3.04, 0.53)
SBCT-4thM: C2(24.97, 2.97, 0.52)
MBCT-SD: C3(25.02, 3.02, 0.54)
MBCT-SR: C4(25.03, 2.98, 0.56)
[Figures: certainty degree μ versus cloud drops x for the original concept and for each method's result.]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Conclusions
▬ A concept can be passed among different kinds of people (BCTs) with some degree of excursion.
48
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
 Experiments on Bidirectional Cognition Processes
(1) Cognizing a concept over and over again
(2) Cognizing process with the increasing of sample size n
(3) Many people’s mutual cognition process for a concept
(4) Multi granularity concept cognition
(5) Image segmentation Based on Bidirectional
Computational Cognition
49
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Prof. Leslie Valiant at Harvard University
Turing Award 2010
Current Research Interests
“A fundamental question for artificial intelligence is
to characterize the computational building blocks
that are necessary for cognition.”
Information/Knowledge Granules
http://people.seas.harvard.edu/~valiant/researchinterests.htm
50
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Age clustering of the members of the Chinese Academy of Engineering
Two concepts generated by A-GCT: C1(53, 3.3, 0.16) and C2(76.4, 5.9, 0.29).
[Figures: the two concepts generated by A-GCT (frequency versus age), and the concept tree generated by A-GCT at different granularity levels.]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Deep-Learning
• Based on connectionism in the 1980's.
• Adds the assumption: factors are
organized into multiple levels of
abstraction or composition
• Deep architectures are composed of
multiple levels of non-linear
operations
G. E. Hinton, R. R. Salakhutdinov, Reducing the Dimensionality of Data with Neural Networks, Science, 313:504-507, 2006
G.E. Hinton, S. Osindero, Y. Teh, A fast learning algorithm for deep belief nets, Neural Computation, 18:1527-1554, 2006
Y. Bengio, P. Lamblin, D. Popovici, H. Larochelle, Greedy Layer-Wise Training of Deep Networks, NIPS 2006, pp. 153-160
M.A. Ranzato, et al., Efficient Learning of Sparse Representations with an Energy-Based Model, NIPS 2006, pp. 1137-1144
Y. Bengio, Learning Deep Architectures for AI, Foundations and Trends in Machine Learning, 2(1):1-127, 2009
G. Anthes, Deep Learning Comes of Age, Communications of the ACM, 56(6): 13-15,2013
52
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
ANFIS: adaptive-network-based fuzzy inference system
Fuzzy inference (Type 3), with antecedent membership functions A1, B1 and A2, B2 over the inputs x ∈ X and y ∈ Y, and consequent $f_1 = p_1 x + q_1 y + r_1$ (and likewise $f_2$):
$f = \frac{w_1 f_1 + w_2 f_2}{w_1 + w_2} = \bar{w}_1 f_1 + \bar{w}_2 f_2$.
J.-S. R. Jang, "ANFIS: Adaptive-network-based fuzzy inference system," IEEE Trans. Syst., Man, Cybern., vol. 23, pp. 665-685, 1993.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
$f = \frac{w_1 f_1 + w_2 f_2}{w_1 + w_2} = \bar{w}_1 f_1 + \bar{w}_2 f_2$
[Figure: the ANFIS network. Inputs x and y feed membership-function nodes A1, A2 and B1, B2; product (Π) nodes compute the firing strengths w_1 and w_2; normalization (N) nodes compute $\bar{w}_1$ and $\bar{w}_2$; the weighted consequents $\bar{w}_1 f_1$ and $\bar{w}_2 f_2$ are summed (Σ) to produce f.]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
– TMLNN: Triple-Valued or Multiple-Valued Logic Neural Network
Classical neuron: inputs X1, X2, ..., Xn with weights W1, W2, ..., Wn and threshold θ; output Y = f(Σ(W_i X_i) − θ).
Triple-valued or multiple-valued logic neuron (TMLN): inputs X1, X2, ..., Xn with weights W1, W2, ..., Wn; internal activation I with f(I); output Y = g(f(I)).
G.Y. Wang, H.B. Shi. Three Valued Logic Neural Network, Proc. of Int. Conf. on Neural Information Processing, Hong Kong, 1112-1115, 1996.
G.Y. Wang, H.B. Shi. TMLNN: Triple-Valued or Multiple-Valued Logic Neural Network, IEEE Trans. on Neural Networks, 9(6):1099-1117, 1998.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Multiple-Valued Logic "Exclusive OR (XOR)" Experiment
Truth table, XOR(A, B) for A, B ∈ {-1, 0, 1}:
A = -1: XOR(-1, -1) = -1, XOR(-1, 0) = 0, XOR(-1, 1) = 1
A = 0:  XOR(0, -1) = 0,  XOR(0, 0) = 0,  XOR(0, 1) = 0
A = 1:  XOR(1, -1) = 1,  XOR(1, 0) = 0,  XOR(1, 1) = -1
[Figure: a TMLNN realizing the multiple-valued XOR, with hidden neurons N_ANB and N_NAB feeding the output neuron N_XOR; the trained weights and thresholds (-4.5, 2.5, 2.6, -1.9, 1.6, 1.5, -1.5, -1.8, -1.6, ...) label the connections from inputs A and B.]
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
MGrC for Big Data
Traditional DM: Granularity Space Optimization
----Searching a suitable granularity level for problem solving.
Traditional GrC: Granularity Level Switching
----Solving a problem in different granularity levels.
New Direction: Multi-Granularity Joint Problem Solving
----Solving a problem in Multi-Granularity levels jointly and
simultaneously.
57
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
 Experiments on Bidirectional Cognition Processes
(1) Cognizing a concept over and over again
(2) Cognizing process with the increasing of sample size n
(3) Many people’s mutual cognition process for a concept
(4) Multi granularity concept cognition
(5) Image segmentation Based on Bidirectional
Computational Cognition
58
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
 Image segmentation
The original images
59
 Results of image segmentation
Fig.1 Expected segmentation results
Fig.2 Segmentation results by K-means
Fig.3 Segmentation results by SBCT-1stM*
Fig.4 Segmentation results by MBCT-SR
* Kun Qin, Kai Xu, Yi Du, Deyi Li. An Image Segmentation Approach Based on Histogram Analysis Utilizing Cloud Model.
2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery (FSKD 2010):524-528.
60
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Results of image segmentation: Error Rate (%)
Method     | Bird and branch | Starfish | Flower | Butterfly | Leopard and branch
K-means    | 69.335          | 52.183   | 29.761 | 45.122    | 47.558
SBCT-1stM  | 29.756          | 32.946   | 16.423 | 36.901    | 44.246
MBCT-SR    | 10.401          | 10.939   | 11.462 | 15.154    | 19.126
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
 Test on noisy images
The images with noise
(5% salt & pepper noise)
62
 The test of noise immunity
Fig.1 Expected segmentation results
Fig.2 Segmentation results by K-means
Fig.3 Segmentation results by SBCT-1stM
Fig.4 Segmentation results by MBCT-SR
63
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
• Test on noisy images: Error Rate (%)
Method     | Bird and branch | Starfish | Flower | Butterfly | Leopard and branch
K-means    | 86.535          | 74.132   | 68.743 | 74.571    | 68.240
SBCT-1stM  | 60.041          | 46.165   | 16.094 | 85.997    | 43.563
MBCT-SR    | 11.968          | 12.791   | 12.603 | 21.861    | 40.552
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
[Figure: a laser cladding image and its gray histogram (frequency versus gray level).]
Three concepts generated by A-GCT from the gray histogram:
C1(81.3, 15.3, 1.81), C2(172.3, 31.7, 3.74), C3(253.2, 1.3, 0.11).
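As a rough illustration (not the authors' published algorithm), once concepts such as C1, C2, C3 have been extracted from the gray histogram, an image can be segmented by labeling each pixel with the concept whose expectation curve gives it the highest certainty degree; the sketch below assumes exactly that rule.

```python
import numpy as np

def segment_by_max_certainty(gray_image, concepts):
    """Label each pixel with the index of the cloud concept (Ex, En, He)
    whose expectation curve exp(-(x - Ex)^2 / (2 En^2)) is largest there."""
    img = np.asarray(gray_image, dtype=float)
    scores = np.stack([np.exp(-(img - ex) ** 2 / (2.0 * en ** 2))
                       for ex, en, _he in concepts])
    return scores.argmax(axis=0)

# The three concepts reported above for the laser cladding image
concepts = [(81.3, 15.3, 1.81), (172.3, 31.7, 3.74), (253.2, 1.3, 0.11)]
labels = segment_by_max_certainty(np.random.randint(0, 256, (64, 64)), concepts)
```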
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
Medical Image Segmentation
[1] C.M. Li. Distance Regularized Level Set Evolution and Its Application to Image Segmentation, IEEE Trans. Image Process. 19
(12) (2010) 3243-3254.
[2] Z. Zivkovic. Gentle ICM energy minimization for Markov random fields with smoothness-based priors, Journal of Real-Time
Image Processing. 11 (1) (2016) 235-246.
[3] S.F. Dai, et al. A novel approach of lung segmentation on chest CT images using graph cuts, Neurocomputing. 1 (2015) 799-807.
[4] M.B. Salah, A. Mitiche, I.B. Ayed, Multiregion Image Segmentation by Parametric Kernel Graph Cuts, IEEE Trans. Image
Process. 20 (2) (2011) 545-557.
66
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
05  Conclusions and Prospects
Conclusions:
• The relationship between cognitive science and artificial intelligence is studied.
• Bidirectional cognitive computing (BCC) is proposed.
• The cloud model is studied as a case study of BCC.
• Some human cognition processes are implemented successfully with BCC.
• BCC is used to solve some real-life key problems such as image segmentation.
CHONGQING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS
68