å°ãåã«ãããã³ã倧åŠã®ææã§ã«ãŒãã®ãŒã¡ãã³å€§åŠã®å士å·ãååŸãããžã§ã³ããã£ã»ããã¡ã³ã³ãã€ã³ããã¯ã¹ã«æ¥ãŸããã 圌ã¯ããã£ãŒããã¥ãŒã©ã«ãããã¯ãŒã¯ã®ãã¬ãŒãã³ã°ã§GPUã¡ã¢ãªã®å¶éã®åé¡ãåé¿ã§ããã³ãŒãã£ã³ã°ã¢ã«ãŽãªãºã ã«ã€ããŠè¬æŒããŸããã
-ç§ã¯ããã³ã倧åŠã®ããã€ãã®ã°ã«ãŒãã®ã¡ã³ããŒã§ãã ãããã®1ã€ã¯ãã³ã³ãã¥ãŒã¿ãŒã·ã¹ãã ãšãããã¯ãŒã¯ã°ã«ãŒãã§ãã ç§èªèº«ã®ã°ã«ãŒããEcoSystem GroupããããŸãã ã°ã«ãŒãã®ååã瀺ãããã«ãç§ã¯æ©æ¢°åŠç¿ã®å°é家ã§ã¯ãããŸããã ãããããã¥ãŒã©ã«ãããã¯ãŒã¯ã¯çŸåšéåžžã«äººæ°ããããã³ã³ãã¥ãŒã¿ãŒã¢ãŒããã¯ãã£ãšãããã¯ãŒã¯ãã³ã³ãã¥ãŒã¿ãŒã·ã¹ãã ãæ±ã人ã
ã¯ããããã®ã¢ããªã±ãŒã·ã§ã³ãç¶ç¶çã«æ±ãå¿
èŠããããŸãã ãããã£ãŠãç§ã¯ãã®1幎åãã2幎éããã®ãããã¯ã§å¿ããããŠããŸãã
ã¡ã¢ãªãããã»ããµãã£ãã·ã¥ã§é©åã«å§çž®ããæ¹æ³ã説æããŸãã ãããã«ãŒãã®ãŒã¡ãã³ã§ã®ã¢ã¡ãªã«ã®å士è«æã®ããŒãã§ããã ããã¯ããã¥ãŒã©ã«ãããã¯ãŒã¯ãªã©ã®ä»ã®ã¢ããªã±ãŒã·ã§ã³ã«åæ§ã®ã¡ã«ããºã ãé©çšããããšãã«ééããåé¡ãç解ããã®ã«åœ¹ç«ã¡ãŸãã
ã³ã³ãã¥ãŒã¿ãŒã¢ãŒããã¯ãã£ã®äž»ãªåé¡ã®1ã€ã¯ãåªãããšãã«ã®ãŒå¹çãã©ã¡ãŒã¿ãŒãåããé«æ§èœã·ã¹ãã ïŒã°ã©ãã£ãã¯ã«ãŒããã³ããã»ããµãŒãé»è©±ãã©ãããããïŒãååŸããããšã§ãã
çŸæç¹ã§ã¯ãã¡ã¢ãªã«å¶éããªãéããããªãé«ãããã©ãŒãã³ã¹ãç°¡åã«åŸãããšãã§ããŸãã å¿
èŠã«å¿ããŠããšããœããããã³ã³ãã¥ãŒã¿ãŒãå
¥æã§ããŸãã åé¡ã¯ãããã«ã©ã®ãããã®é»æ°ãè²»ããå¿
èŠããããã§ãã ãããã£ãŠãäž»ãªåé¡ã®1ã€ã¯ã䜿çšå¯èœãªãªãœãŒã¹ã§è¯å¥œãªããã©ãŒãã³ã¹çµæãåŸãããšãšåæã«ããšãã«ã®ãŒå¹çã®ãã©ã³ã¹ã厩ããªãããšã§ãã
ãšãã«ã®ãŒå¹çåãžã®éã®äž»ãªåé¡ã®1ã€ã¯ãã¯ã©ãŠãã³ã³ãã¥ãŒãã£ã³ã°ãããŸããŸãªã¢ãã€ã«ããã€ã¹ã§äœ¿çšãããå€ãã®éèŠãªã¢ããªã±ãŒã·ã§ã³ã«ã転éãšã¹ãã¬ãŒãžã®äž¡æ¹ã®é倧ãªããŒã¿ã³ã¹ããããããšã§ãã ãããã¯ãææ°ã®ããŒã¿ããŒã¹ãã°ã©ãã£ãã¯ã«ãŒãããããŠãã¡ããæ©æ¢°åŠç¿ã§ãã ããã«ã¯ãã«ãŒãã«ãããããã¯ãŒã¯ãªãœãŒã¹ãŸã§ãã¹ã¿ãã¯ã®ãã¹ãŠã®ã¬ãã«ã®éåžžã«æ·±å»ãªãªãœãŒã¹ãå¿
èŠã§ãã
ããŸããŸãªæé©åãå®è¡ããå¿
èŠãããå Žåã«çºçããéèŠãªåé¡ã®1ã€ã次ã«ç€ºããŸããå®éã«ã¯ãããã¿ã€ãã®ãªãœãŒã¹ãå¥ã®ã¿ã€ãã«çœ®ãæããããšãã§ããŸãã ãããŸã§ãã³ã³ãã¥ãŒã¿ãŒã¢ãŒããã¯ãã£ã§ã¯ãå ç®ãä¹ç®ãç®è¡æŒç®ãªã©ã®ã³ã³ãã¥ãŒãã£ã³ã°ãªãœãŒã¹èªäœãéåžžã«é«äŸ¡ã§ããã ãã ãããã®ç¶æ³ã¯è¿å¹Žå€åããŠãããããã¯ãããã»ããµã³ã¢ã®éçºãã¡ã¢ãªãžã®ã¢ã¯ã»ã¹é床ãããã¯ããã«éãé²ãã ããã§ãã
ãã®çµæããšãã«ã®ãŒã«é¢ããå ç®ã®1ã€ã®ç®è¡æŒç®ã§ã¯ãçŽ1ãã³ãžã¥ãŒã«ã®ã³ã¹ããããããŸãã ãã®å Žåãæµ®åå°æ°ç¹ã䜿çšãã1ã€ã®æäœã§ããæµ®åå°æ°ç¹ã«ã¯ãçŽ20ãã³ãžã¥ãŒã«ã®ã³ã¹ããããããŸãã ã¡ã¢ãªãã4ãŸãã¯8ãã€ããèªã¿åãããå Žåãå°ãªããšã2æ¡ä»¥äžã®ãšãã«ã®ãŒãæ¶è²»ããŸãã ãããŠãããã¯é倧ãªåé¡ã§ãã ã¡ã¢ãªãæäœããããšãããšãããªãè²»çšãããããŸãã ãŸããã©ã®ããã€ã¹ã«ã€ããŠè©±ããŠãããã¯é¢ä¿ãããŸããã ç¶æ³ã¯ãæºåž¯é»è©±ã§ãã倧èŠæš¡ãªã¯ã©ã¹ã¿ãŒãã¹ãŒããŒã³ã³ãã¥ãŒã¿ãŒã§ãåãã§ãã
ãããããçŸåšã®æºåž¯é»è©±ã§ãããšãã«ã®ãŒè³æºã«ååã«äœ¿çšã§ããªãéåžžã«å€ãã®è³æºãç¶ãããšã«ãªããŸãã ææ°ã®æºåž¯é»è©±ã䜿çšããå ŽåãAndroidã§ãiPhoneã§ããããŒã¯æã®ã¡ã¢ãªãšã³ã¢éã®å©çšå¯èœãªåž¯åå¹
ã¯çŽ50ïŒ
ãã䜿çšã§ããŸããã ãããè¡ããªããšãé»è©±ãéç±ãããã誰ãéç±ãããŸãããã¡ã¢ãªãšã³ã¢éã®éä¿¡äžã«ãã¹åšæ³¢æ°ãäœäžããããã©ãŒãã³ã¹ãäœäžããŸãã
ããªãããŒãªæé©åãé©çšãããªãéããå€ãã®ãªãœãŒã¹ã¯çŸåšäœ¿çšã§ããŸããã
ããŸããŸãªã¬ãã«ã§ããŸããŸãªãªãœãŒã¹ã®äžè¶³ã«å¯ŸåŠãã1ã€ã®æ¹æ³ã¯ãããŒã¿ãå§çž®ããããšã§ãã ããã¯æ°ããæé©åã§ã¯ãªãããããã¯ãŒã¯ãšãã©ã€ãã®äž¡æ¹ã«æ£åžžã«é©çšãããŠãããããŸããŸãªãŠãŒãã£ãªãã£ã䜿çšããŠããŸãã Linuxã§ã¯ãå€ããgzipãŸãã¯BZip2ãŠãŒãã£ãªãã£ã䜿çšãããšããŸãããã ãããã®ãŠãŒãã£ãªãã£ã¯ãã¹ãŠããã®ã¬ãã«ã§éåžžã«ããŸãé©çšãããŠããŸãã éåžžãã¢ã«ãŽãªãºã ã¯ãããã³ãšã³ã³ãŒãã£ã³ã°ãŸãã¯Lempel-Zivã«åºã¥ããŠé©çšãããŸãã
ãããã®ã¢ã«ãŽãªãºã ã¯ãã¹ãŠãéåžžã倧éã®èªåœãå¿
èŠãšããŸããåé¡ã¯ãã¢ã«ãŽãªãºã ãéåžžã«äžè²«æ§ããããçŸä»£ã®ã¢ãŒããã¯ãã£ã«ããŸãé©åãããéåžžã«äžŠè¡ããŠããããšã§ãã æ¢åã®ããŒããŠã§ã¢ãèŠããšãå°ãªããšã5幎åã®æåã®äœæ¥ã®äžéšãŸã§ã¯ãã¡ã¢ãªããã£ãã·ã¥ããŸãã¯ããã»ããµã¬ãã«ã§ã®å§çž®ã¯å®éã«ã¯äœ¿çšãããŠããŸããã§ããã
ãããèµ·ãã£ãçç±ãšãå§çž®ãããŸããŸãªã¬ãã«ã§å©çšã§ããããã«ããããã«äœãã§ããããã€ãŸãããã£ãã·ã¥ã§çŽæ¥å§çž®ããæ¹æ³ã«ã€ããŠèª¬æããŸãã ãã£ãã·ã¥å§çž®-å§çž®ãããŒããŠã§ã¢ã§çŽæ¥è¡ãããããšãæå³ããŸããã€ãŸããããã»ããµãã£ãã·ã¥èªäœã®ããžãã¯ã®äžéšãå€æŽãããŸãã ãã®ããŒãã¹ã®ããŒãã¹ã«ã€ããŠç°¡åã«èª¬æããŸãã ã¡ã¢ãªå
ã®å§çž®ãã©ã®ãããªåé¡ããããã«ã€ããŠèª¬æããŸãã ããã¯ãŸã£ããåãããšã®ããã§ãããã¡ã¢ãªã«å¹æçã«å§çž®ãå®è£
ããããšã¯å®å
šã«ç°ãªãããã£ãã·ã¥ãšã¯ç°ãªããŸãã NVidiaãšã®é£æºã«ã€ããŠèª¬æããŸããNVidiaã§ã¯ãææ°ã®GPUåãã®å®éã®ããŒããŠã§ã¢ã§åž¯åå¹
å§çž®ãè¡ããæé©åã¯ææ°äžä»£ã®GPUã«ãŒãã§ããVoltã§è¡ãããŸããã ãããŠãå§çž®ããŒã¿ããŸã£ãã解åããã«çŽæ¥å®è¡ããå Žåã®å®å
šã«æ ¹æ¬çãªæé©åã«ã€ããŠèª¬æããŸãã
ãã£ãã·ã¥ã®å§çž®ã«é¢ããããã€ãã®èšèã ããã¯2012幎ã®PACTäŒè°ã§ã®èšäºã§ããããã®äœæ¥ã¯Intelãšå
±åã§è¡ãããŸããã 2 MBãŸãã¯4 MBã®ããã»ããµãã£ãã·ã¥ã4 MBãŸãã¯8 MBã«å€æŽããå Žåãäž»ãªåé¡ãäœã§ããããæ確ã«ããããã ããªãã¯Lå§çž®ãããŸããããåé¡ã¯äœã§ããïŒ
倧ãŸãã«èšããšãã¡ã¢ãªã¢ã¯ã»ã¹æäœãããå Žåãx86ã¢ãŒããã¯ãã£ã«ã€ããŠè©±ããŠããå Žåãã¡ã¢ãªã«ããŒããŸãã¯ã¹ãã¢ããã¬ãžã¹ã¿ã«ããŒã¿ããªãå Žåã¯ã1次ãã£ãã·ã¥ã«ç§»åããŸãã éåžžããããã¯ææ°ã®ããã»ããµã§ã¯3ãŸãã¯4ã¯ããã¯ãµã€ã¯ã«ã§ãã ããŒã¿ãããå Žåã¯ãCPUã«æ»ããŸãã ååšããªãå Žåãã¡ã¢ãªãªã¯ãšã¹ãã¯éå±€ãããã«é²ã¿ãL2ãã£ãã·ã¥ã«å°éããŸãã
ã»ãšãã©ã®Intelããã»ããµã®L2ãã£ãã·ã¥ã«ã¯ããã£ãã·ã¥ã®ãµã€ãºã«å¿ããŠ15ã20ã¯ããã¯ãµã€ã¯ã«ããããŸãã ãããŠãã¡ã¢ãªã«è¡ããªãã£ãå ŽåãããŒã¿ã¯éåžžãL2ãã£ãã·ã¥ã§èŠã€ãã£ãå Žåã«æ»ããŸãã ããŒã¿ã¯ããã«ããã»ããµã«éãããL1ãã£ãã·ã¥ã«ä¿åãããŸããçªç¶ãã®ããŒã¿ãåå©çšãç¶ããŠããã»ããµã«è¿ã¥ããšãããŒã¿ã¯ä¿åãããŸãã
åé¡ã¯ãããŒã¿ãå§çž®ãããŠããå Žåãå§çž®ããã»ã¹ãæé©åããæ¹æ³ã¯åé¡ã§ã¯ãªãã解åã¯åžžã«ã¯ãªãã£ã«ã«ã¹ã¿ãŒããã¹äžã«ãããšããããšã§ãã 2次ãã£ãã·ã¥ãžã®ä»¥åã®ã¢ã¯ã»ã¹ã«15ãµã€ã¯ã«ããã£ãå Žåãå§çž®è§£é€ã«é¢é£ããé
延ã¯ãªã¯ãšã¹ãã®é
延ã«è¿œå ãããŸãã ãŸãããã®å¶éã¯ãã¡ã¢ãªå
ãšããã¥ãŒã©ã«ãããã¯ãŒã¯ã®ãã¬ãŒãã³ã°ãªã©ã®å®éã®ã¢ããªã±ãŒã·ã§ã³ã«é©çšããå Žåã®äž¡æ¹ã§ãã»ãšãã©ãã¹ãŠã®å§çž®ã¢ããªã±ãŒã·ã§ã³ã«åœãŠã¯ãŸããŸãã解åã¯åžžã«ã¯ãªãã£ã«ã«ãã¹äžã«ããããã®é
延ãå®è¡ã«ãããæéã¯éåžžã«éèŠã§ãã
ããã¯ç§ãã¡ã«ãšã£ãŠäœãæå³ããã®ã§ããããïŒ ãã£ãã·ã¥ã¬ã€ãã³ã·ã15ãµã€ã¯ã«ã®ã¬ãã«ã§ããããšãç解ããŠããå Žåã¯ã解åãéåžžã«æé©åããå¿
èŠããããŸãã ãããæ°ããã»ããµãµã€ã¯ã«ã§ååã§ãã ãããã©ãã»ã©å°ããããç解ããããã«ã1ã€ã®ãã©ã¹èšå·ã¯çŽ2ã€ã®æªçœ®ãåããŸãã ã€ãŸããéåžžã«è€éãªããšã¯ã§ããŸããã
ãããäž»ãªçç±ã§ãIntelã¯ããæç¹ã§ãã£ãã·ã¥å§çž®ã®éçºãåæ¢ããŸããã 圌ãã¯ããã§åããã°ã«ãŒãå
šäœãæã£ãŠããŠã2005-2006幎ã«åœŒãã¯çŽ5ãµã€ã¯ã«ã解åããã¢ã«ãŽãªãºã ãéçºããŸããã ãã®é
延ã¯çŽ30ïŒ
å¢å ããŸãããããã£ãã·ã¥ã¯ã»ãŒ2åã«ãªããŸããã ãã ããèšèšè
ã¯ã»ãšãã©ã®ã¢ããªã±ãŒã·ã§ã³ãèŠãŠãé«äŸ¡ããããšèšããŸããã
ç§ã2011幎ã«ãã®ãããã¯ã«åãçµã¿å§ãããšãã圌ãã¯ããªãã1-2ã¹ãããã§äœããããããšãã§ããã°ããããè©Šãããã«å®éã®ããŒããŠã§ã¢ã§è¡ãããšãã§ãããšèšããŸããã
ããŸããŸãªã¢ã«ãŽãªãºã ãè©ŠããŸãããããã§ã«æç®ã§å©çšå¯èœãªã¢ã«ãŽãªãºã ã䜿çšã§ããªãã£ãçç±ã®1ã€ã¯ããããããã¹ãŠãœãããŠã§ã¢ã§äœæãããŠããããšã§ãã ãœãããŠã§ã¢ã«ã¯ä»ã®å¶éãããã人ã
ã¯ããŸããŸãªèŸæžãªã©ã䜿çšããŸãã ãããã®ææ³ãå®éã®ããŒããŠã§ã¢ã§äœæããããšãããšãåäœãéåžžã«é
ããªããŸãã IBMã¯ãLempel-Zivã¢ã«ãŽãªãºã ãgzipãšå®å
šã«åãã«ããå®å
šã«ããŒããŠã§ã¢ã«ããŸããã解åã«ã¯64ãµã€ã¯ã«ããããŸããã ãã£ãã·ã¥ã§ã¯ããã䜿çšãããã¡ã¢ãªå
ã§ã®ã¿äœ¿çšããããšã¯æããã§ãã
ç§ã¯æŠç¥ãå€ããããšããŸããã ãœãããŠã§ã¢ã¢ã«ãŽãªãºã ãæé©åããããšãã代ããã«ããã£ãã·ã¥ã«ä¿åãããŠããå®éã®ããŒã¿ã確èªãããã®ããŒã¿ã«é©ããã¢ã«ãŽãªãºã ãäœæããããšã«ããŸããã
é説çã«ã20ïŒ
ãã30ïŒ
ãŸã§å€ãã®ãŒããããããšãããããŸããã Intelã®ã¢ããªã±ãŒã·ã§ã³ã®å€§ããªããã±ãŒãžã䜿çšããå Žåãèšç®ã«äœ¿çšãã200ã®ç°ãªãã¢ããªã±ãŒã·ã§ã³ããããŸã-å€ãã®ãŒãã ããã¯åæåã§ãããå€æ°ã®ãŒããæã€è¡åã§ããããããã¯ãã«ãã€ã³ã¿ãŒã§ãã ããã«ã¯å€ãã®çç±ããããŸãã
å€ãã®å Žåãå€ãéè€ããŠããŸãã ãã£ãã·ã¥å
ã®å°ããªã¡ã¢ãªé åã§éåžžã«å°ããªå€ãç¹°ãè¿ãããšãã§ããŸãã ããã¯ãããšãã°ãã°ã©ãã£ãã¯ã¹ã䜿çšããŠããå Žåããã¯ã»ã«ã®æããããåãè²ã®ç»åã®äžéšãããå Žåãè¡ã®ãã¹ãŠã®ãã¯ã»ã«ã¯åãã«ãªããŸãã ãŸãããããŒå€ã¯ã2ãã€ãã4ãã€ããããã³8ãã€ãã«æ ŒçŽãããã·ã³ã°ã«ãã€ãããã³ããã«ãã€ãã®å€ã§ãã ãªããããèµ·ãã£ãŠããã®ã§ããïŒããã¯èª°ã®ééãã§ããïŒ ãã®ãããªåé·æ§ã¯ã©ãããæ¥ãã®ã§ããããïŒ
åé·æ§ã¯ãã³ãŒãã®ããã°ã©ãã³ã°æ¹æ³ã«é¢é£ããŠããŸãã C ++ãªã©ãããçš®ã®èšèªã䜿çšããŸãã ããšãã°ãé
åå
šäœãªã©ã®ãªããžã§ã¯ãã«ã¡ã¢ãªãå²ãåœãŠããå Žåãé
åå
ã®ããã€ãã®ã€ãã³ãã«é¢ããçµ±èšæ
å ±ãä¿åãããšããŸãããããã®ã€ãã³ãã¯éåžžã«é »ç¹ã«çºçããå¯èœæ§ããããŸãã ããšãã°ãç¹å®ã®åœä»€ã䜿çšããã¡ã¢ãªã¢ã¯ã»ã¹ã ã»ãšãã©ã®åœä»€ã¯ã¢ã¯ã»ã¹ãããŸããããèµ·åäžã«äœåååãã¢ã¯ã»ã¹ãããåœä»€ããããŸãã
ããã°ã©ãã¯ãææªã®å ŽåãæŽæ°å€ã倧ããªå€ããšãããšãããããã8ãã€ãã®æ°å€ã®é
åãå²ãåœãŠãå¿
èŠããããŸãã ãã ããããã¯åé·ã§ãã ãããã®å€ã®å€ãã¯å®éã«ã¯å¿
èŠã§ã¯ãªããäžå®å
šãªãŒãããããããããŸãããå
è¡ãŒãã®äžéšãå
ã«ãããŸãã
ããã«ãããŸããŸãªã¿ã€ãã®åé·æ§ãæã€ä»ã®å€ãã®å€ããããŸãã ããšãã°ããã€ã³ã¿ãŒã äžåºŠã³ãŒãããããã°ããŠãã€ã³ã¿ãŒãèŠã人ã¯ãããããéåžžã«å€§ããå€åããŠããããšã«æ°ä»ãã§ãããã ãã ããã»ãŒåãã¡ã¢ãªé åãæã€ãã€ã³ã¿ãŒãããå Žåãã»ãšãã©ã®ãããã¯åãã«ãªããŸãã ãã®ã¿ã€ãã®åé·æ§ãæããã§ãã
ç§ã¯å€ãã®çš®é¡ã®åé·æ§ãèŠãŸããã æåã®è³ªåã¯ãããã€ããããšããããšã§ãã
ããã¯ã2次ãã£ãã·ã¥ããå®æçã«ããŒã¿ãååŸãããã®ãã£ãã·ã¥ã®ã¹ãããã·ã§ãããä¿åããŠããŒããããã€ãããã調ã¹ãŠãå€ãç¹°ãè¿ãããå®éšã§ãã X軞ã§ã¯ãã³ã³ãã¥ãŒã¿ãŒã¢ãŒããã¯ãã£ã§ç©æ¥µçã«äœ¿çšãããŠããSPEC2006ããã±ãŒãžã®ããŸããŸãªã¢ããªã±ãŒã·ã§ã³ãšãIntelã®ä»ã®ããŸããŸãªã¢ããªã±ãŒã·ã§ã³ã®äž¡æ¹ããããŒã¿ããŒã¹ãšApachiãµãŒããŒãªã©ã®ããŸããŸãªWebã¯ãŒã¯ãããŒã®äž¡æ¹ã§ãã ãããŠãããã¯2ã¡ã¬ãã€ãã®L2ãã£ãã·ã¥ã§ãããšããä»®å®ã§ãã
ç°ãªãã¢ããªã±ãŒã·ã§ã³ã®åé·æ§ã«ã¯å€§ããªã°ãã€ããããããšã«æ°ã¥ããããããŸãããããããã®éåžžã«åçŽãªãã¿ãŒã³ã§ããéåžžã«äžè¬çã§ãã ãã¹ãŠã®ãã£ãã·ã¥ã©ã€ã³ã®43ïŒ
ããã£ãã·ã¥ã«æ ŒçŽãããŠãããã¹ãŠã®ããŒã¿ãã«ããŒããã®ã¯ãããã ãã§ãã
ãããã®ãã¿ãŒã³ãšä»ã®ãã¿ãŒã³ãã«ããŒããåªããå§çž®ããã©ãŒãã³ã¹ãæäŸããååã«ã·ã³ãã«ãªãã®ãèãåºãããšãã§ããŸãã
ããããåé¡ã¯ãããã®ãã¿ãŒã³ãäœãé¢é£ãããã®ãïŒ ãããã®ãã¿ãŒã³ã®ããããã«ç¹ã«æ©èœããäœããã®çš®é¡ã®å§çž®ã¢ã«ãŽãªãºã ãäœæããå¿
èŠããããŸããããããšãå
±éç¹ããããŸããïŒ
芳å¯ã®èãæ¹ã¯äžè¬çã§ããã ãããã®å€ã¯ãã¹ãŠã倧ããå Žåãå°ããå ŽåããããŸãããäž¡è
ã®éãã¯ã»ãšãã©ãããŸããã 倧ãŸãã«èšã£ãŠãç¹å®ã®åãã£ãã·ã¥ã©ã€ã³ã®å€ã®ãã€ãããã¯ã¬ã³ãžã¯éåžžã«å°ããã§ãã ãŸãããã£ãã·ã¥ã«æ ŒçŽãããŠããå€ãæ³åã§ããŸããããšãã°ã32ãã€ãã®ãã£ãã·ã¥ã©ã€ã³ã¯ãBase + Delta Encodingã䜿çšããŠç°¡åã«è¡šãããšãã§ããŸãã
ããšãã°ãæåã®å€ãããŒã¹ãšããä»ã®ãã¹ãŠããã®ããŒã¹ããã®ãªãã»ãããšããŠæ瀺ããŸãã ãŸããã»ãšãã©ã®å Žåãäºãã®å€ã¯ããã»ã©å€ãããªãããããã«ã¿ã¯1ãã€ãã«é
眮ããã32ãŸãã¯64ãã€ãã§ã¯ãªãã12ãã€ãã ãã§ååã§ãããçŽ20ãã€ãã®ã¹ããŒã¹ãç¯çŽã§ããŸãã
ãããå®éã®ããŒããŠã§ã¢ã«å®è£
ããæ¹æ³ã®è©³çŽ°ã«ã€ããŠã¯èª¬æããŸããã å®éã®ãããã¿ã€ããäœæããVerilogã§ãããã¿ã€ããäœæããææ°ã®FPGAã§ãããã¿ã€ããäœæããå®è£
ã«ã€ããŠIntelãšè©±ããŸããã ãã®ã¢ã€ãã¢ã«åºã¥ããŠã¢ã«ãŽãªãºã ãäœæããããšãã§ããŸããããã¯ã解åã«1ã€ãŸãã¯2ã€ã®ã¯ããã¯ãµã€ã¯ã«ã®ã¿ãå¿
èŠãšããŸãã ãã®ã¢ã«ãŽãªãºã ã¯é©çšå¯èœã§ãããè¯å¥œãªå§çž®ãæäŸããŸã...
ãã£ãã·ã¥ã§äœ¿çšãããæé«ã®ä»¥åã®äœåã¯ãè¿œå ã¹ããŒã¹ã®çŽ50ïŒ
ãæäŸããŸããã ããã¯çŽç²ãªå§çž®ã§ã¯ãããŸãã-ã¯ããã«å€ããäžããããšãã§ããŸã-ããã¯å¹æçãªå§çž®ã®æ¬åœã®ããŒãã¹ã§ããã€ãŸãããŠãŒã¶ãŒã«ãšã£ãŠãã£ãã·ã¥ãã©ãã ãèŠãããã§ãã æçåãªã©ãããããçš®é¡ã®åé¡ã«å¯ŸåŠããå¿
èŠããããŸãã
ã€ã³ãã«ãæã£ãŠããæé«ã®ä»¥åã®ã¡ã«ããºã ã®ã¬ãã«ã§å§çž®ãè¡ãããŠããŸãããã¹ã©ã€ãã®éäžã§ã®äž»ãªå©ç¹ã¯å§çž®è§£é€ã§ãã 以åã®ã¢ã«ãŽãªãºã ã§ã¯ãæé«ã®å§çž®è§£é€ã¯5ã9ãµã€ã¯ã«ã§ããã å§çž®ã¯éåžžã«å¹æçã§ããã1ã2ãµã€ã¯ã«ã§å®è¡ã§ããŸããã
ãã®çš®ã®ã¢ã«ãŽãªãºã ã¯ãå®éã®ããŒããŠã§ã¢ã§å®è¡ããã¡ã¢ãªãªã©ã®ãã£ãã·ã¥ã§äœ¿çšã§ããŸãã
ãã®ãããªæé©åããã£ãã·ã¥ã«é©çšãããšããã£ãã·ã¥ã¯ãŠãŒã¶ãŒã«ãšã£ãŠã»ãŒ2åã®å¹æãããããã«èŠããããšãå€ããªããŸãã ããã¯ã©ãããæå³ã§ããïŒ ææ°ã®ããã»ããµãåçã§èŠããšãã³ã¢èªäœã¯ã»ãšãã©ãããŸããã ããã»ããµãã£ãã·ã¥ã¯ããã®å€§éšåãå ããŠããŸããIBMãšIntelã®äž¡æ¹ã«ãšã£ãŠ40ã50ïŒ
ã¯ç°¡åã§ãã å®éãIntelã¯åã«ãã£ãã·ã¥ãååŸããŠ2åã«ããããšã¯ã§ããŸãããåã«ãã£ãã·ã¥ãè¿œå ããäœå°ã¯ãããŸããã ãããŠãã«ãŒãã«èªäœã®æ°ããŒã»ã³ãã®ã³ã¹ãããããããªããã®ãããªæé©åã¯ããã¡ããéåžžã«èå³æ·±ããã®ã§ãã
2çªç®ã®äœæ¥ã§ã¯ããŸããŸãªæé©åãè¡ããŸããããä»æ¥ã¯èª¬æããŸãããããã£ãã·ã¥ã©ã€ã³ã®ãµã€ãºãå€æŽã§ããããã«ãªããŸããã ãããã®åé¡ã¯ãã¹ãŠæ£åžžã«è§£æ±ºãããŸããã
Intelã§ãè¡ããã3çªç®ã®äœæ¥ãã¡ã¢ãªã®å§çž®æ¹æ³ã«ã€ããŠã話ããããšæããŸãã
ããã§ã®åé¡ã¯äœã§ããïŒ
äž»ãªåé¡ã¯ãLinuxãŸãã¯Windowsã«4 KBã®ã¡ã¢ãªããŒãžãããå Žåããããå§çž®ããããã«æ¬¡ã®åé¡ã解決ããå¿
èŠãããããšã§ãããã®ããŒãžã®ããŒã¿ã¢ãã¬ã¹ãã©ã®ããã«å€åããããšããåé¡ã解決ããå¿
èŠããããŸãã æåã¯4 KBã§ããã®äžã®åãã£ãã·ã¥ã©ã€ã³ã64ãã€ãã§ãã ãŸãããã®ã¡ã¢ãªããŒãžå
ã®ãã£ãã·ã¥ã©ã€ã³ã®ãªãã»ãããèŠã€ããã®ã¯ç°¡åã§ãã64ãåããå¿
èŠãªãªãã»ãããæããŸãã
ãã ããå§çž®ãé©çšãããšãåè¡ã®ãµã€ãºãç°ãªãå¯èœæ§ããããŸãã ãã£ãã·ã¥ã®ã¡ã¢ãªããããŒãžãèªã¿èŸŒãå¿
èŠãããå Žåããããã®ãªãã»ããã¯ãããŸããã
ã©ããã«ä¿åã§ãããšèšããŸãã ãããŠãããããã©ãã«ä¿åããŸããïŒ åã³ã¡ã¢ãªãŸãã¯ãã£ãã·ã¥ã«ä¿åããŸãã ãã ããåã¡ã¢ãªã®ãã¹ãŠã®ãªãã»ãããä¿åããå Žåããã£ãã·ã¥ã¯ãããŸããããã¹ãŠã®ã¡ã¢ãªãåŠçããã«ã¯ãæ°çŸMBã®ãªãœãŒã¹ãå¿
èŠã§ãã ãã®ããŒã¿ããããã«ä¿åããããšã¯ã§ããŸãããããã¹ãŠã®ã¡ã¢ãªã¢ã¯ã»ã¹ã«ã¯è€æ°ã®ã¡ã¢ãªã¢ã¯ã»ã¹ããããããã¡ã¢ãªã«ä¿åããããšã¯æãŸãããããŸããã æåã«ã¡ã¿ããŒã¿ãååŸãã次ã«å®éã®ããŒã¿ãååŸããŸãã
OSã§ã®äœæ¥äžã«èª°ããééãã2çªç®ã®åé¡ã¯ãããŒã¿ã®æçåã§ããã ããã§ã¯ãåããŒãžã®ã¡ã¢ãªå
ã®å Žæãç°ãªããããéåžžã«å°é£ã«ãªããŸãã ããã«ãä»®æ³ã¢ãã¬ã¹ç©ºéã§ã¯ããã¹ãŠã®ããŒãžã4 KBã®ãŸãŸã§ãããå§çž®åŸããããã¯ãã¹ãŠå®å
šã«ç°ãªããµã€ãºãå æããŸãã ãããŠãããã¯åé¡ã§ãããã®ç©ºã®å Žæãã©ã®ããã«äœ¿çšã§ããã®ã§ããããã OSã¯ãããŒãžãå°ããããããšãèš±å¯ããããšãèªèããŠããŸãããæçåããããããã®æçã¯è¡šç€ºãããŸããã ãã®ããã«ãäœãå€æŽããªãéããå§çž®ããŒãã¹ã¯åŸãããŸããã
ãã®åé¡ã解決ããããã«äœãææ¡ããŸãããïŒ ç·åœ¢ä¿æ°ã䜿çšããå§çž®ã èšäºã§è©³ãã説æãããŠããäžé£ã®å¶éã課ããŸãããããã€ã³ãã¯ãã¡ã¢ãªã«å§çž®ãé©çšããå Žåããã®ããŒãžã®åãã£ãã·ã¥ã©ã€ã³ãç¹å®ã®ä¿æ°ã§å§çž®ãããããšãä¿èšŒããã¢ã«ãŽãªãºã ã䜿çšããããšã§ã4察1ã3察1ããŸãã¯ãŸã£ããå§çž®ããŸããã
è¿œå ã®å¶éã課ããããããŒã¿å§çž®ã®é¢ã§äœãã倱ãå¯èœæ§ããããŸãããäž»ãªããŒãã¹ã¯ãèšèšãéåžžã«åçŽã§ãããå®è£
ã§ããããšã§ããããã¯ãããšãã°Linuxã§æåããŸããã
ç§ãã¡ãææ¡ããææ³ã§ããç·åœ¢å§çž®ããŒãžïŒLCPïŒã¯ããã¹ãŠã®ããŒã¿ã®ã¢ãã¬ã¹ãéåžžã«åçŽã§ãããšããäž»ãªåé¡ã«å¯ŸåŠããŸãã åãããŒãžã«æ ŒçŽãããå°ããªã¡ã¿ããŒã¿ãããã¯ããããå§çž®ããã圢åŒã§æ ŒçŽãããå
ã®ããŒã¿ããŸãã¯ããããäŸå€ã¹ãã¬ãŒãžãšåŒã°ãããã£ãã·ã¥ã©ã€ã³ãæ ŒçŽãããŸããããã®æ¹æ³ã§ã¯å§çž®ã§ããŸããã
ãããã®ã¢ã€ãã¢ã«åºã¥ããŠãæã
ã¯åªããããŒã¿å§çž®ãåŸãããšãã§ããŸãããæãéèŠãªããšã¯ãäž»ã«IBMãè¡ã£ãæé«ã®ä»¥åã®äœåãšæ¯èŒããããšã§ãã
ç§ãã¡ã®å§çž®ã¯ããããã®å§çž®ãããã¯ããã«åªããŠããŸããã§ããããæãéèŠãªããšã¯ãäœåãªããã©ãŒãã³ã¹ãæ¯æãããšãªããããå€ãã®ã¡ã¢ãªãååŸããŸããã倧ãŸãã«èšãã°ãã¡ã¢ãªã¯å¢ããŸãããããã®ãããããã©ãŒãã³ã¹ã¯ããããªããäœäžããŸããããäœäžããŠããŸããããããŠããšãã«ã®ãŒå¹çã®æ倱ããããŠããã¹ãŠããã©ã¹ã«ããŠãããã©ãŒãã³ã¹ãåäžããããšãã«ã®ãŒã³ã¹ããåæžããŸããã
å®éã®ããŒããŠã§ã¢ã§ã®æåŸã®äœæ¥ã«ã€ããŠç°¡åã«èª¬æããŸãããããNvidiaãšã®ä»äºã§ããåé¡ã¯ãéä¿¡ãã£ãã«ãä»ããŠã¡ã¢ãªãããããã«ããŒã¿ã転éããéã®ãšãã«ã®ãŒå¹çã§ããã°ã©ãã£ãã¯ã¡ã¢ãªã«ãŒãã§ã¯ãã¡ã¢ãªèªäœã¯ã¯ããã«å°ãããªããŸããããã®ã¡ã¢ãªã®ã¹ã«ãŒãããã¯ã¯ããã«å€§ãããã»ãšãã©ã®ã°ã©ãã£ãã¯ã«ãŒãã§5ã6åã«ãªããŸããå€ãã®ã°ã©ãã£ãã¯ã¢ããªã±ãŒã·ã§ã³ã§ã¯ã倧éã®ããŒã¿ãå¹ççã«åŠçããããã«ãã®åž¯åå¹
ãå¿
èŠãšããããã§ãã
, Nvidia, 2014 , , , . , : , DVFS, , , , , , , .
â , , . , , . . , - , , , . Nvidia , .
, , . . , , , 0011. . 0101. , 0 0, , bit toggle, , , , , , 0 1 1 0. , . , . , .
â , . , , networks of chip DRM , , .
? ?
Nvidia, . , 32 16 , . ? , . , , XOR, 16 8 , , .
- , , frequent pattern compression, Intel, , . 15, 16 . , , , alignment. , . , . , Nvidia . , , , , . , 2%, 40%. , .
, . , , , , , , , , , , .
. , , , . , , , - , â , . , , . , , , , , .
, , , Microsoft Research , machine learning.
, , , , . DNNs. . , DNNs â , - Ajura, , , , . , 10%. , , , : TensorFlow, CNTK, MXNet. , , .
DNNs , , , GPU.
DNNs . , , , , . . , , , , . , .
, . , , inference.
, Google TPU , , , inferent , , Microsoft.
? ? Google, , TPU inference. , . , inference, . forward pass. , - , inference . â . , . backward pass, , , .
, , . , . , , feature maps activations. inference , . back propagation, , , , . , L2, , forward pass , , , , .
, , 100, 200 , , ResNet, image classification. - . â , , ResNet 101 , , , - mini batch , , 64 , , , 16 . , P100 Pascal Volt100, 16 . , .
, Facebook, . , , , . , .
, . GPU, . , .
, , , , , . X AlexNet, DNN , : Overfeat, VGG16, Inception ( Google version 3, , Google). mini batch. , , . , , ImageNet, 220 220 3, 1000 : , , .
? AlexNet , 2011 , . , , feature maps, .
? , , mini batch, , , inputs. ? GPU . GPU 8000 threads, , , GPU . , ? , CPU ISAC.
, . , . , , , , , , , DNN inference, , ISCA, MICRO, 2015 15 , deep neural networks, inference, .
-, inference, , forward pass, , , .
, , . , inference, - , , 32- floating point, , 16 , 8 4. , , - stochastic gradient descend, . , . GPU, ISAC, - TPU - FPGA . GPU .
, . , , , , , , , , , , , LX, . , , . . : .
, . - , . , . . . , . , . , . .
, , ? , . , , . , , , . , . - , .
, , : feature maps . , , , , . , , , Relu , rectified linear function pulling layer, , , .
, , . , . , , , Relu, pulling layer. layers, .
, Relu , , - â . â .
, Relu, . , . Relu â , , , 0, . â , , .
, , , by modal. . , , . , , . - , 32 , , , 31 . , , .
32 , . . , . , pulling layer â 2 2, , 2 2 3 3. , , , . TensorFlow, CNTK, , , . , , machine learning, , . . , , , - , , .
- , , , , . Relu 32 , pulling layer 8 . , , TensorFlow, . - . .
, 32 1 . , , . , . 32-, , . .
, lossless, .
, Relu . . , , VGG16, 10 , , â , . 1 100% , 0,6 â 60% . , .
, , Relu_1, Relu_2 , . , , 60%, 70%. , , , . , .
ããã¯ã©ãããæå³ã§ããïŒ , . , , GPU. CUDA , . , - CUDA . , NVidia , 99%. . , 50% 80% . , GPU . , , , CUDNN , Nvidia . , CUDA , .
, , , Lossy Encoding, . , IBM . , L1 - . 32 , 8 16, - , .
, 32- 16- , , , , , , AlexNet, 32 16 - , , , , validation. , , . , , , . , , , .
. , , , . , , . â . forward pass . , , backward pass, . backward pass, .
?
, . â AlexNet FP32, 32 . , IBM , All-FP16, 16 , . - 16 , , 16 , , , - .
, , , , 16 , 8 . , .
, , . , , .
, Gist, «». DNN, execution graph, , , , CNTK , Microsoft. CUDA . execution graph, , TensorFlow, MXNet, . GPU , , . , , .
, , , cuDNN, , . . . , TensorFlow. GIST .
, , ? . , , , , , , 90-100 , . . , , . 6% 8%, , , , . , , Microsoft.
, . , , Microsoft, PhD , , DNN Training, , , , , . . Microsoft , LSTM , speech , , . - -, .
. , , . image classification, AlexNet, ResNet , . Image classification â , , , Facebook. , machine learning .
, , , . , . , MXNet, Amazon, , , TensorFlow. ResNet 50, , «» , 40% , TensorFlow . , MXNet , TensorFlow? . . LSTM, MXNet , TensorFlow, - , Amazon Google, TensorFlow 2,5 . , . .
, . , . , , , , machine learning, .
, , Image Classification, , . object detection, , , Faster RCNNs, 16-17- Google MIT, . : LSTM, , , . nmt â Google, open source. sockeye Amazon. .
Transformer, Attention layers. 2013-, , , , , Attention layers. Google , GPU, , .
speech recognition, LSTM. adversarial networks, reinforcement learning, Google AlphaGo . supervise, n-supervise learning. image classification. , . 16-17- , .
, . TensorFlow, CNTK, MXNet. , , , . TensorFlow , - CNTK MXNet. , , , . , , .
. GPU. , CPU . , , GPU, FPGA, ISAC Google. , TensorFlow, .
: , , , , -. .
, , , , . , , , CNTK. . , . TensorFlow , , 2000 , , , , . . , CNTK, MXNet. TensorFlow , . , Google ⊠, , , , , .
, .
, TensorFlow MXNet . , NMT Google, Sockeye. , - , LSTM , batch, , .
, , , -, Y blue score. , , . , . .
? , , , blue score 20 . , , . SGD. , , - , .
- , learning rate , -, . . , , , , , , . , TensorFlow LSTM . CUDNN, MXNet LSTM. , .
, , TensorFlow, MXNet. , GPU, LSTM , 30% GPU . 70% . , , . TensorFlow 10-15%, . , MXNet. .
LSTM, , Nvidia CUDNN , LSTM . , . -, E , , CUDNN 2,5 LSTM. , «» P100 CUDNN 8 , «» 100 CUDNN 9 . . , . .
reinforcement learning, . , NIPS ICLR, . , machine learning MATLAB . , , . , , , . . Google, , .
, - , . .
reinforcement learning, MXNet TensorFlow, , .
, . , , , , , , . , , .
, , : , . ããã¯ã©ãããæå³ã§ããïŒ , , . , - , , . , . , Google , 5-7%.
cloud tax 20â25%, , , . , Google Cloud, 25% â , .
â . , - , ? , , , .
, . . , -, - . - , - . . , , , . , where value = 10, value . , .
, 4 .
, , , Base+Delta Encoding, - . , 1 , . .
, -. , , ? - , , . , , - , n . , , , . , Base+Delta Encoding . , , Base+Delta Encoding.
ããã¯ã©ãããæå³ã§ããïŒ , , . .
, , , , , , .
, 1 â , . , , â . , . . , CND- Intel . , 1 , . 4 8 .
, . . , , . , . .