Refereed Journal Papers

T. Kanamori, S. Fujiwara, A. Takeda,
Robustness of Learning Algorithms using Hinge Loss with Outlier Indicators.
Neural Networks, to appear.

T. Takenouchi, T. Kanamori,
Statistical Inference with Unnormalized Discrete Models and Localized Homogeneous Divergences.
Journal of Machine Learning Research, to appear.

T. Kanamori, T. Takenouchi,
Graph-based Composite Local Bregman Divergences on Discrete Sample Spaces.
Neural Networks, to appear.

K. Matsui, W. Kumagai, T. Kanamori,
Parallel Distributed Block Coordinate Descent Methods based on Pairwise Comparison Oracle.
Journal of Global Optimization, to appear.

S. Fujiwara, A. Takeda, T. Kanamori,
DC Algorithm for Extended Robust Support Vector Machine.
Neural Computation, vol. 29, no. 5, May 2017.

T. Kanamori, S. Fujiwara, A. Takeda,
Breakdown Point of Robust Support Vector Machines.
Entropy, vol. 19, no. 2, 83, February 2017.

T. Kanamori,
Efficiency Bound of Local Z-Estimators on Discrete Sample Spaces.
Entropy, vol. 18, no. 7, 273, July 2016.

T. Kanamori, H. Fujisawa,
Robust Estimation under Heavy Contamination using Unnormalized Models.
Biometrika, vol. 102, no. 3, pp. 559-572, Sep. 2015.

A. Takeda, S. Fujiwara, T. Kanamori,
Extended Robust Support Vector Machine Based on Financial Risk Minimization.
Neural Computation, vol. 26, no. 11, pp. 2541-2569, Nov. 2014.

T. Kanamori and H. Fujisawa,
Affine Invariant Divergences associated with Proper Composite Scoring Rules and their Applications.
Bernoulli, vol. 20, no. 4, pp. 2278-2304, Nov. 2014.

T. Kanamori and A. Takeda,
A Numerical Study of Learning Algorithms on Stiefel Manifold.
Computational Management Science, vol. 11, issue 4, pp. 319-340, Oct. 2014.

A. Takeda, T. Kanamori,
Using Financial Risk for Analyzing Generalization Performance of Machine Learning Models.
Neural Networks, vol. 57, pp. 29-38, Sep. 2014.

T. D. Nguyen, M. C. du Plessis, T. Kanamori, M. Sugiyama,
Constrained Least-Squares Density-Difference Estimation.
IEICE Transactions on Information and Systems, vol. E97-D, no. 7, pp. 1822-1829, July 2014.

T. Kanamori,
Scale-Invariant Divergences for Density Functions.
Entropy, vol. 16, no. 5, pp. 2611-2628, May 2014.

T. Kanamori and M. Sugiyama,
Statistical Analysis of Distance Estimators with Density Differences and Density Ratios.
Entropy, vol. 16, no. 2, pp. 921-942, Feb. 2014.

T. Kanamori, A. Ohara,
A Bregman extension of quasi-Newton updates II: analysis of robustness properties.
Journal of Computational and Applied Mathematics, vol. 253, pp. 104-122, Dec. 2013.

M. Sugiyama, T. Kanamori, T. Suzuki, M. C. du Plessis, S. Liu, I. Takeuchi,
Density Difference Estimation.
Neural Computation, vol. 25, no. 10, pp. 2734-2775, Oct. 2013.

T. Kanamori, A. Takeda, T. Suzuki,
Conjugate Relation between Loss Functions and Uncertainty Sets in Classification Problems.
Journal of Machine Learning Research, vol. 14, pp. 1461-1504, June 2013.

M. Sugiyama, S. Liu, M. C. du Plessis, Y. Yamanaka, M. Yamada, T. Suzuki, T. Kanamori,
Direct Divergence Approximation between Probability Distributions and Its Applications in Machine Learning.
Journal of Computing Science and Engineering, vol. 7, no. 2, pp. 99-111, June 2013.

M. Yamada, T. Suzuki, T. Kanamori, H. Hachiya, M. Sugiyama,
Relative Density-Ratio Estimation for Robust Distribution Comparison.
Neural Computation, vol. 25, no. 5, pp. 1324-1370, May 2013.

M. Kawakita, T. Kanamori,
Semi-Supervised Learning with Density-Ratio Estimation.
[arXiv]
Machine Learning, vol. 91, issue 2, pp. 189-209, May 2013.

T. Kanamori,
Statistical Models and Learning Algorithms for Ordinal Regression Problems.
Information Fusion, vol. 14, issue 2, pp. 199-207, April 2013.

T. Kanamori, T. Takenouchi,
Improving LogitBoost with Prior Knowledge.
Information Fusion, vol. 14, issue 2, pp. 208-219, April 2013.

A. Takeda, H. Mitsugi, T. Kanamori,
A Unified Classification Model Based on Robust Optimization.
Neural Computation, vol. 25, no. 3, pp. 759-804, March 2013.

T. Kanamori, T. Suzuki, M. Sugiyama,
Computational Complexity of Kernel-Based Density-Ratio Estimation: A Condition Number Analysis.
[arXiv]
Machine Learning, vol. 90, issue 3, pp. 431-460, March 2013.

T. Kanamori, A. Ohara,
A Bregman extension of quasi-Newton updates I: an information geometrical framework.
[arXiv]
Optimization Methods and Software, vol. 28, issue 1, pp. 96-123, February 2013.

M. Sugiyama, T. Suzuki, T. Kanamori,
Density-ratio matching under the Bregman divergence: A unified framework of density-ratio estimation.
[site]
Annals of the Institute of Statistical Mathematics, vol. 64, no. 5, pp. 1009-1044, October 2012.

T. Kanamori, H. Uehara, M. Jimbo,
Pooling Design and Bias Correction in DNA Library Screening.
[arXiv]
Journal of Statistical Theory and Practice, vol. 6, issue 1, pp. 220-238, March 2012.

T. Kanamori, T. Suzuki, M. Sugiyama,
Statistical analysis of kernel-based least-squares density-ratio estimation.
[site]
Machine Learning, vol. 86, issue 3, pp. 335-367, March 2012.

T. Kanamori, T. Suzuki, M. Sugiyama,
f-divergence estimation and two-sample homogeneity test under semiparametric density-ratio models.
[arXiv]
IEEE Transactions on Information Theory, vol. 58, issue 2, pp. 708-720, February 2012.

T. Kanamori, A. Takeda,
Worst-Case Violation of Sampled Convex Programs for Optimization with Uncertainty.
[arXiv]
Journal of Optimization Theory and Applications, vol. 152, issue 1, pp. 171-197, January 2012.

H. Shimodaira, T. Kanamori, M. Aoki, K. Mine,
Multiscale Bagging and its Applications.
[site]
IEICE Transactions on Information and Systems, vol. E94-D, no. 10, pp. 1924-1932, October 2011.

M. Sugiyama, T. Suzuki, Y. Itoh, T. Kanamori, M. Kimura,
Least-Squares Two-Sample Test.
[site]
Neural Networks, vol. 24, issue 7, pp. 735-751, September 2011.

S. Hido, Y. Tsuboi, H. Kashima, M. Sugiyama, T. Kanamori,
Statistical Outlier Detection Using Direct Density Ratio Estimation.
[site]
Knowledge and Information Systems, vol. 26, no. 2, pp. 309-336, August 2011.

M. Sugiyama, M. Yamada, P. von Bünau, T. Suzuki, T. Kanamori, M. Kawanabe,
Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search.
[site]
Neural Networks, vol. 24, pp. 183-198, March 2011.

T. Kanamori,
Deformation of Log-Likelihood Loss Function for Multiclass Boosting.
[site]
Neural Networks, vol. 23, issue 7, pp. 843-864, May 2010.

T. Kanamori, T. Suzuki, M. Sugiyama,
Theoretical Analysis of Density Ratio Estimation.
[site]
IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences,
vol. E93-A, no. 4, pp. 787-798, April 2010.

M. Sugiyama, I. Takeuchi, T. Suzuki, T. Kanamori, H. Hachiya, D. Okanohara,
Least-Squares Conditional Density Estimation.
[site]
IEICE Transactions on Information and Systems, vol. E93-D, no. 3, pp. 583-594, March 2010.

A. Takeda, T. Kanamori,
A Robust Approach Based on Conditional Value-at-Risk Measure to Statistical Learning Problems.
[site]
European Journal of Operational Research, vol. 198, issue 1, pp. 287-296, Oct. 2009.

M. Sugiyama, T. Kanamori, T. Suzuki, S. Hido, J. Sese, I. Takeuchi, L. Wang,
A Density-ratio Framework for Statistical Data Processing.
[site]
IPSJ Transactions on Computer Vision and Applications, vol. 1, pp. 183-208, Sep. 2009.

T. Kanamori, S. Hido, M. Sugiyama,
A Least-squares Approach to Direct Importance Estimation.
[site]
Journal of Machine Learning Research, vol. 10, pp. 1391-1445, July 2009.

I. Takeuchi, K. Nomura, T. Kanamori,
Nonparametric Conditional Density Estimation Using Piecewise-Linear Path Following for Kernel Quantile Regression.
[site]
Neural Computation, vol. 21, no. 2, pp. 533-559, Feb. 2009.

T. Suzuki, M. Sugiyama, T. Kanamori, and J. Sese,
Mutual information estimation reveals global associations between stimuli and biological processes.
[site]
BMC Bioinformatics, vol. 10, no. 1, S52, Jan. 2009.

T. Takenouchi, S. Eguchi, N. Murata, T. Kanamori,
Robust Boosting Algorithm against Mislabelling in Multi-Class Problems.
[site]
Neural Computation, vol. 20, no. 6, pp. 1596-1630, June 2008.

T. Kanamori,
Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability.
[site]
IEICE Transactions on Information and Systems, vol. E90-D, no. 12, pp. 2033-2042, Dec. 2007.

T. Kanamori,
Pool-based Active Learning with Optimal Sampling Distribution and its Information Geometrical Interpretation.
[site]
Neurocomputing, vol. 71, issues 1-3, pp. 353-362, Dec. 2007.

T. Kanamori, T. Takenouchi, S. Eguchi, N. Murata,
Robust Loss Functions for Boosting.
[site]
Neural Computation, vol. 19, no. 8, pp. 2183-2244, Aug. 2007.

T. Kanamori, T. Takenouchi, N. Murata,
Geometrical Structure of Boosting Algorithm.
New Generation Computing, Tutorial Series on Brain-Inspired Computing, Part 6, vol. 25, no. 1, pp. 117-141, Nov. 2006.

T. Kanamori and I. Takeuchi,
Conditional Mean Estimation under Asymmetric and Heteroscedastic Error by Linear Combination of Quantile Regressions.
Computational Statistics and Data Analysis, vol. 50, issue 12, pp. 3605-3618, Aug. 2006.

N. Murata, T. Takenouchi, T. Kanamori, S. Eguchi,
Information Geometry of U-Boost and Bregman Divergence.
Neural Computation, vol. 16, no. 7, pp. 1437-1481, July 2004.

T. Kanamori, H. Shimodaira,
Active Learning Algorithm Using the Maximum Weighted Log-likelihood Estimator.
Journal of Statistical Planning and Inference, vol. 116, issue 1, pp. 149-162, Sep. 2003.

I. Takeuchi, Y. Bengio, T. Kanamori,
Robust Regression with Asymmetric Heavy-Tail Noise Distributions.
Neural Computation, vol. 14, no. 10, pp. 2469-2496, Oct. 2002.

T. Kanamori,
Statistical Asymptotic Theory of Active Learning.
Annals of the Institute of Statistical Mathematics, vol. 54, no. 3, pp. 459-475, Sep. 2002.

T. Kanamori, H. Shimodaira,
An Active Learning Algorithm Using an Information Criterion for the Maximum Weighted Loglikelihood Estimator.
Proceedings of the Institute of Statistical Mathematics, vol. 48, no. 1, pp. 197-212, June 2000.

T. Kanamori,
Active Learning Algorithm using Maximum Weighted Likelihood Estimator.
Bulletin of the Computational Statistics in Japan, vol. 11, no. 2, pp. 65-75, Oct. 1998.

International Conference Papers

Takenouchi T., Kanamori T.,
Empirical Localization of Homogeneous Divergences on Discrete Sample Spaces.
Neural Information Processing Systems (NIPS 2015), poster & spotlight, to appear.

Kanamori T.
Legendre Transformation in Machine Learning.
Workshop: Information Geometry for Machine Learning, December 2014.

Fujisawa H., Kanamori T.,
Affine invariant divergences with applications to robust statistics.
The 7th International Conference of the ERCIM WG on Computational
and Methodological Statistics (ERCIM 2014), the University of Pisa, Italy, 6-8 December 2014.

Kanamori T., Fujisawa H.,
Affine Invariant Divergences and their Applications.
The 3rd Institute of Mathematical Statistics Asia Pacific Rim Meeting, June 29-July 3, 2014.

Sugiyama M., Kanamori T., Suzuki T., du Plessis M., Liu S., Takeuchi I.,
Density-Difference Estimation.
Neural Information Processing Systems (NIPS 2012), Lake Tahoe, Nevada, United States, 3-8 Dec. 2012.

Kanamori T., Takeda A.
Non-Convex Optimization on Stiefel Manifold and Applications to Machine Learning.
The 19th International Conference on Neural Information Processing (ICONIP 2012), Doha, Qatar, 12-15 Nov. 2012.

Takeda A., Kanamori T., Mitsugi H.,
Robust optimization-based classification method.
The 21st International Symposium on Mathematical Programming
(ISMP 2012), Berlin, Germany, 19-24 Aug. 2012.

Kanamori T., Suzuki T., Sugiyama M.,
f-divergence estimation and two-sample test under semiparametric density ratio models.
The 2nd Institute of Mathematical Statistics Asia Pacific Rim Meeting (IMS-APRM 2012),
Tsukuba, Japan, 2-4 July 2012.

Takeda, A., Mitsugi, H., Kanamori, T.
A Unified Robust Classification Model.
29th International Conference on Machine Learning (ICML 2012),
Edinburgh, Scotland, Jun. 26-Jul. 1, 2012.

Kanamori, T., Takeda, A., Suzuki, T.
A Conjugate Property between Loss Functions and Uncertainty Sets in Classification Problems.
25th International Conference on Learning Theory (COLT 2012),
Edinburgh, Scotland, Jun. 25-27, 2012.

Yamada, M., Suzuki, T., Kanamori, T., Hachiya, H., Sugiyama, M.,
Relative density-ratio estimation for robust distribution comparison.
Presented at Neural Information Processing Systems (NIPS 2011), Granada, Spain, Dec. 13-15, 2011.

Shimodaira H., Kanamori T., Aoki M., Mine K.,
Multiscale Bagging with Applications to Classification and Active Learning.
The 2nd Asian Conference on Machine Learning, Nov. 2010.

Aoki M., Kanamori T., Shimodaira H.,
Multiscale Bagging with Applications to Classification.
The 2nd Asian Conference on Machine Learning, Nov. 2010.

Kanamori T. and Ohara A.,
A Bregman extension of quasi-Newton updates.
Information Geometry and its Applications, Germany, Aug. 2010.

Sugiyama, M., Takeuchi, I., Kanamori, T., Suzuki, T., Hachiya, H., & Okanohara, D.,
Conditional density estimation via leastsquares density ratio estimation.
In Proceedings of the Thirteenth International Conference on Artificial
Intelligence and Statistics (AISTATS 2010), JMLR Workshop and
Conference Proceedings, vol. 9, pp. 781-788, Sardinia, Italy, May 13-15, 2010.

Sugiyama, M., Hara, S., von Bünau, P., Suzuki, T., Kanamori, T., & Kawanabe, M.,
Direct density ratio estimation with dimensionality reduction.
In S. Parthasarathy, B. Liu, B. Goethals, J. Pei, and C. Kamath
(Eds.), Proceedings of the 10th SIAM International Conference on Data
Mining (SDM 2010), pp. 595-606, Columbus, Ohio, USA, Apr. 29-May 1, 2010.

T. Kanamori
Efficient direct importance estimation for covariate shift
adaptation and outlier detection.
The 1st Institute of Mathematical Statistics Asia Pacific Rim
Meeting, Seoul, June 28-July 1, 2009.

T. Kanamori, T. Suzuki, M. Sugiyama,
Condition Number Analysis of Kernel-based Density Ratio Estimation.
ICML Workshop on Numerical Mathematics in Machine Learning, Montreal, Canada, June 2009.

Suzuki T., Sugiyama M., Kanamori T., Sese J.,
Mutual Information Estimation Reveals Global Associations
between Stimuli and Biological Process.
The 7th Asia Pacific Bioinformatics Conference (APBC 2009),
Beijing, China, 13-16 January 2009.

Kanamori, T.,
A Least-squares Approach to Direct Importance Estimation and its Applications.
Joint Session of the CSA, JSS and KSS at 2008 Statistical Symposium,
Taipei, Dec. 19, 2008.

Hido, S., Tsuboi, Y., Kashima, H., Sugiyama, M., Kanamori, T.,
Inlier-based outlier detection via direct density ratio estimation.
In xxx and xxx (Eds.), Proceedings of the IEEE International Conference on
Data Mining (ICDM 2008), pp. xxx-xxx, Pisa, Italy, Dec. 15-19, 2008.

Kanamori T., Sugiyama M., Hido S.,
Efficient Direct Density Ratio Estimation for Nonstationarity
Adaptation and Outlier Detection.
NIPS 2008.

Suzuki T., Sugiyama M., Sese J., Kanamori T.,
A least-squares approach to mutual information estimation
with application in variable selection.
Workshop on New Challenges for Feature Selection in Data Mining
and Knowledge Discovery 2008 (FSDM 2008), Antwerp, Belgium,
Sep. 15, 2008.

Suzuki, T., Sugiyama, M., Sese, J. and Kanamori, T.
Approximating mutual information by maximum likelihood density ratio estimation.
In xxx and yyy (Eds.), Proceedings of the Workshop on New Challenges
for Feature Selection in Data Mining and Knowledge Discovery 2008
(FSDM 2008), JMLR Workshop and Conference Proceedings, vol. xxx, pp. yyy-zzz, 2008.
(Presented at Workshop on New Challenges for
Feature Selection in Data Mining and Knowledge Discovery 2008
(FSDM2008), Antwerp, Belgium, Sep. 15, 2008.)

Kanamori, T.
Multiclass Boosting Algorithms for Shrinkage Estimators of
Class Probability.
18th International Conference on Algorithmic Learning Theory,
Sendai International Center, Sendai, Japan, 2007.

Kanamori, T.
WorstCase Violation of Sampled Convex Programs for
Optimization with Uncertainty.
International Conference on Continuous Optimization, Hamilton,
Canada, 2007.

Kanamori, T. and Takeda, A.
WorstCase Violation of Sampled Convex Programs for
Optimization with Uncertainty.
International Symposium on Mathematical Programming,
Rio de Janeiro, Brazil, 2006.

Takeuchi, I., Nomura, K. and Kanamori, T.
The Entire Solution Path of Kernel-based Nonparametric
Conditional Quantile Estimator.
International Joint Conference on Neural Networks,
Vancouver, Canada, 2006.

Kanamori, T.
Integrability of weak learner on boosting.
The 2nd International Symposium on Information Geometry and
its Applications, pp. 300-307, University of Tokyo, Tokyo, Japan, 2005.

Kanamori, T. and Takeuchi, I.
Estimators for Conditional Expectations under Asymmetric and
Heteroscedastic Error Distributions.
International Symposium on The Art of Statistical Metaware,
The Institute of Statistical Mathematics, Tokyo, Japan, 2005.

Kanamori T., Takenouchi, T., Eguchi, S., and Murata, N.
The most robust loss function for boosting.
Neural Information Processing: 11th International Conference (ICONIP 2004), Calcutta,
Lecture Notes in Computer Science, vol. 3316, pp. 496-501, Springer.

Kanamori T. and Takeuchi, I.
Robust Estimation of Conditional Mean by the Linear
Combination of Quantile Regressions.
International Conference on Robust Statistics,
Beijing, China, 2004.

Kanamori, T.
A New Sequential Algorithm for Regression Problems by using
Mixture Distribution.
In Proceedings of 2002 International Conference on
Artificial Neural Networks (ICANN'02),
pp. 535-540, Madrid, Spain, August 2002.

Bengio, Y., Takeuchi, I. and Kanamori, T.
The Challenge of NonLinear Regression on Large Datasets with
Asymmetric Heavy Tails.
In Proceedings of the Joint Statistical Meeting. American
Statistical Association, New York, U.S.A., August 2002.

Shimodaira, H., and Kanamori, T.
Information Criteria for Predictive Inference with the
Weighted LogLikelihood and the Active Learning.
International Society for Bayesian Analysis,
Sixth World Meeting, Hersonissos, Heraklion, Crete, May 2000.

Articles
Kanamori, T.,
Divergence Estimation using Density-ratio and its Applications (in Japanese),
Japan Statistical Society, Sep. 2014.

Kanamori, T.,
Statistics: Statistical Learning Theory (in Japanese),
Suugaku Seminar, May 2011.

Sugiyama, M., Suzuki, T., Kanamori, T.,
Density ratio estimation: A comprehensive review.
In Statistical Experiment and Its Related Topics, Research Institute for Mathematical Sciences Kokyuroku, no. 1703, pp. 10-31, 2010.

Murata, N., Kanamori, T., Takenouchi, T.,
Boosting and Learning Algorithms (in Japanese),
The Journal of the Institute of Electronics, Information and Communication Engineers, vol. 88, no. 9, pp. 724-729, 2005.

Kanamori, T., Murata, N.,
Boosting and Robustness (in Japanese),
The Journal of the Institute of Electronics, Information and Communication Engineers, vol. 86, no. 10, pp. 769-772, 2003.

Books

T. Kanamori, T. Suzuki, I. Takeuchi, I. Sato
Continuous Optimization for Machine Learning (in Japanese)
Kodansya scientific, Dec. 2016.

T. Kanamori
Statistical Learning Theory (in Japanese)
Kodansya scientific, Aug. 2015.

M. Sugiyama, T. Suzuki, T. Kanamori
Density Ratio Estimation in Machine Learning
Cambridge University Press, Feb. 2012.

T. Kanamori, T. Takenouchi, N. Murata
Pattern Recognition in R (in Japanese)
Kyoritu Syuppan, Oct. 2009.

T. Kanamori, K. Hatano, O. Watanabe
Boosting (in Japanese)
Morikita Syuppan, Sep. 2006.

Book Chapters

[Chapter contribution] Loss Functions and Risk Measures in Statistical Learning Theory
in Aspects of Modeling (in Japanese), Kindai Kagakusya, Sep. 2016.

[Translation] Randomized Optimization
in Handbook of Monte Carlo Methods (in Japanese), Asakura Syoten, Oct. 2014.

[Translation] Model Assessment and Selection
in The Elements of Statistical Learning (in Japanese), Kyoritsu Syoten, June. 2014.

[contribution] Statistical Learning Theory
in Handbook of Applied Mathematics (in Japanese)
Asakura Syoten, Nov. 2013.

[contribution] Ensemble Learning
in Encyclopedia of Mathematical Engineering (in Japanese)
Asakura Shoten, Nov. 2011.

T. Kanamori, H. Shimodaira,
[Chapter contribution] Geometry of Covariate Shift with Applications to Active Learning
in Dataset Shift in Machine Learning, MIT Press, 2008.