Refereed Journal Papers

  1. T. Kanamori, S. Fujiwara, A. Takeda,
    Robustness of Learning Algorithms using Hinge Loss with Outlier Indicators.
    Neural Networks, to appear.
  2. T. Takenouchi, T. Kanamori,
    Statistical Inference with Unnormalized Discrete Models and Localized Homogeneous Divergences.
    Journal of Machine Learning Research, to appear.
  3. T. Kanamori, T. Takenouchi,
    Graph-based Composite Local Bregman Divergences on Discrete Sample Spaces.
    Neural Networks, to appear.
  4. K. Matsui, W. Kumagai, T. Kanamori,
    Parallel Distributed Block Coordinate Descent Methods based on Pairwise Comparison Oracle.
    Journal of Global Optimization, to appear.
  5. S. Fujiwara, A. Takeda, T. Kanamori,
    DC Algorithm for Extended Robust Support Vector Machine.
    Neural Computation, vol. 29, no. 5, May 2017.
  6. T. Kanamori, S. Fujiwara, A. Takeda,
    Breakdown Point of Robust Support Vector Machines.
    Entropy, vol. 19, no. 2, art. 83, February 2017.
  7. T. Kanamori,
    Efficiency Bound of Local Z-Estimators on Discrete Sample Spaces.
    Entropy, vol. 18, no. 7, art. 273, July 2016.
  8. T. Kanamori, H. Fujisawa,
    Robust Estimation under Heavy Contamination using Unnormalized Models.
    Biometrika, vol. 102, no. 3, pp. 559-572, Sep. 2015.
  9. A. Takeda, S. Fujiwara, T. Kanamori,
    Extended Robust Support Vector Machine Based on Financial Risk Minimization.
    Neural Computation, vol. 26, no. 11, pp. 2541-2569, Nov. 2014.
  10. T. Kanamori and H. Fujisawa,
    Affine Invariant Divergences associated with Proper Composite Scoring Rules and their Applications.
    Bernoulli, vol. 20, no. 4, pp. 2278-2304, Nov. 2014.
  11. T. Kanamori and A. Takeda,
    A Numerical Study of Learning Algorithms on Stiefel Manifold.
    Computational Management Science, vol. 11, issue 4, pp. 319-340, Oct. 2014.
  12. A. Takeda, T. Kanamori,
    Using Financial Risk for Analyzing Generalization Performance of Machine Learning Models.
    Neural Networks, vol. 57, pp. 29-38, Sep. 2014.
  13. T. D. Nguyen, M. C. du Plessis, T. Kanamori, M. Sugiyama,
    Constrained Least-Squares Density-Difference Estimation.
    IEICE Transactions on Information and Systems, vol. E97-D, no. 7, pp. 1822-1829, July 2014.
  14. T. Kanamori,
    Scale-Invariant Divergences for Density Functions.
    Entropy, vol. 16, no. 5, pp. 2611-2628, May 2014.
  15. T. Kanamori and M. Sugiyama,
    Statistical Analysis of Distance Estimators with Density Differences and Density Ratios.
    Entropy, vol. 16, no. 2, pp. 921-942, Feb. 2014.
  16. T. Kanamori, A. Ohara,
    A Bregman extension of quasi-Newton updates II: analysis of robustness properties.
    Journal of Computational and Applied Mathematics, vol. 253, pp. 104-122, Dec. 2013.
  17. M. Sugiyama, T. Kanamori, T. Suzuki, M. C. du Plessis, S. Liu, I. Takeuchi,
    Density Difference Estimation.
    Neural Computation, vol. 25, no. 10, pp. 2734-2775, Oct. 2013.
  18. T. Kanamori, A. Takeda, T. Suzuki,
    Conjugate Relation between Loss Functions and Uncertainty Sets in Classification Problems.
    Journal of Machine Learning Research, vol. 14, pp. 1461-1504, June 2013.
  19. M. Sugiyama, S. Liu, M. C. du Plessis, Y. Yamanaka, M. Yamada, T. Suzuki, T. Kanamori,
    Direct Divergence Approximation between Probability Distributions and Its Applications in Machine Learning.
    Journal of Computing Science and Engineering, vol. 7, no. 2, pp. 99-111, June 2013.
  20. M. Yamada, T. Suzuki, T. Kanamori, H. Hachiya, M. Sugiyama,
    Relative Density-Ratio Estimation for Robust Distribution Comparison.
    Neural Computation, vol. 25, no. 5, pp. 1324-1370, May 2013.
  21. M. Kawakita, T. Kanamori,
    Semi-Supervised Learning with Density-Ratio Estimation.
    Machine Learning, vol. 91, issue 2, pp. 189-209, May 2013.
  22. T. Kanamori,
    Statistical Models and Learning Algorithms for Ordinal Regression Problems.
    Information Fusion, vol. 14, issue 2, pp. 199-207, April 2013.
  23. T. Kanamori, T. Takenouchi,
    Improving LogitBoost with Prior Knowledge.
    Information Fusion, vol. 14, issue 2, pp. 208-219, April 2013.
  24. A. Takeda, H. Mitsugi, T. Kanamori,
    A Unified Classification Model Based on Robust Optimization.
    Neural Computation, vol. 25, no. 3, pp. 759-804, March 2013.
  25. T. Kanamori, T. Suzuki, M. Sugiyama,
    Computational Complexity of Kernel-Based Density-Ratio Estimation: A Condition Number Analysis.
    Machine Learning, vol. 90, issue 3, pp. 431-460, March 2013.
  26. T. Kanamori, A. Ohara,
    A Bregman extension of quasi-Newton updates I: an information geometrical framework.
    Optimization Methods and Software, vol. 28, issue 1, pp. 96-123, February 2013.
  27. M. Sugiyama, T. Suzuki, T. Kanamori,
    Density-ratio matching under the Bregman divergence: A unified framework of density-ratio estimation.
    Annals of the Institute of Statistical Mathematics, vol. 64, no. 5, pp. 1009-1044, October 2012.
  28. T. Kanamori, H. Uehara, M. Jimbo,
    Pooling Design and Bias Correction in DNA Library Screening.
    Journal of Statistical Theory and Practice, vol. 6, issue 1, pp. 220-238, March 2012.
  29. T. Kanamori, T. Suzuki, M. Sugiyama,
    Statistical analysis of kernel-based least-squares density-ratio estimation.
    Machine Learning, vol. 86, issue 3, pp. 335-367, March 2012.
  30. T. Kanamori, T. Suzuki, M. Sugiyama,
    f-divergence estimation and two-sample homogeneity test under semiparametric density-ratio models.
    IEEE Transactions on Information Theory, vol. 58, issue 2, pp. 708-720, February 2012.
  31. T. Kanamori, A. Takeda,
    Worst-Case Violation of Sampled Convex Programs for Optimization with Uncertainty.
    Journal of Optimization Theory and Applications, vol. 152, issue 1, pp. 171-197, January 2012.
  32. H. Shimodaira, T. Kanamori, M. Aoki, K. Mine,
    Multiscale Bagging and its Applications.
    IEICE Transactions on Information and Systems, vol. E94-D, no. 10, pp. 1924-1932, October 2011.
  33. M. Sugiyama, T. Suzuki, Y. Itoh, T. Kanamori, M. Kimura,
    Least-Squares Two-Sample Test.
    Neural Networks, vol. 24, issue 7, pp. 735-751, September 2011.
  34. S. Hido, Y. Tsuboi, H. Kashima, M. Sugiyama, T. Kanamori,
    Statistical Outlier Detection Using Direct Density Ratio Estimation.
    Knowledge and Information Systems, vol. 26, no. 2, pp. 309-336, August 2011.
  35. M. Sugiyama, M. Yamada, P. von Bünau, T. Suzuki, T. Kanamori, M. Kawanabe,
    Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search.
    Neural Networks, vol. 24, pp. 183-198, March 2011.
  36. T. Kanamori,
    Deformation of Log-Likelihood Loss Function for Multiclass Boosting.
    Neural Networks, vol. 23, issue 7, pp. 843-864, May 2010.
  37. T. Kanamori, T. Suzuki, M. Sugiyama,
    Theoretical Analysis of Density Ratio Estimation.
    IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol. E93-A, no. 4, pp. 787-798, April 2010.
  38. M. Sugiyama, I. Takeuchi, T. Suzuki, T. Kanamori, H. Hachiya, D. Okanohara,
    Least-Squares Conditional Density Estimation.
    IEICE Transactions on Information and Systems, vol. E93-D, no. 3, pp. 583-594, March 2010.
  39. A. Takeda, T. Kanamori,
    A Robust Approach Based on Conditional Value-at-Risk Measure to Statistical Learning Problems.
    European Journal of Operational Research, vol. 198, issue 1, pp. 287-296, Oct. 2009.
  40. M. Sugiyama, T. Kanamori, T. Suzuki, S. Hido, J. Sese, I. Takeuchi, L. Wang,
    A Density-ratio Framework for Statistical Data Processing.
    IPSJ Transactions on Computer Vision and Applications, vol. 1, pp. 183-208, Sep. 2009.
  41. T. Kanamori, S. Hido, M. Sugiyama,
    A Least-squares Approach to Direct Importance Estimation.
    Journal of Machine Learning Research, vol. 10, pp. 1391-1445, July 2009.
  42. I. Takeuchi, K. Nomura, T. Kanamori,
    Nonparametric Conditional Density Estimation Using Piecewise-Linear Path Following for Kernel Quantile Regression.
    Neural Computation, vol. 21, no. 2, pp. 533-559, Feb. 2009.
  43. T. Suzuki, M. Sugiyama, T. Kanamori, and J. Sese,
    Mutual information estimation reveals global associations between stimuli and biological processes.
    BMC Bioinformatics, vol. 10, suppl. 1, art. S52, Jan. 2009.
  44. T. Takenouchi, S. Eguchi, N. Murata, T. Kanamori,
    Robust Boosting Algorithm against Mislabelling in Multi-Class Problems.
    Neural Computation, vol. 20, no. 6, pp. 1596-1630, June 2008.
  45. T. Kanamori,
    Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability.
    IEICE Transactions on Information and Systems, vol. E90-D, no. 12, pp. 2033-2042, Dec. 2007.
  46. T. Kanamori,
    Pool-based Active Learning with Optimal Sampling Distribution and its Information Geometrical Interpretation.
    Neurocomputing, vol. 71, issues 1-3, pp. 353-362, Dec. 2007.
  47. T. Kanamori, T. Takenouchi, S. Eguchi, N. Murata,
    Robust Loss Functions for Boosting.
    Neural Computation, vol. 19, no. 8, pp. 2183-2244, Aug. 2007.
  48. T. Kanamori, T. Takenouchi, N. Murata,
    Geometrical Structure of Boosting Algorithm.
    New Generation Computing, Tutorial Series on Brain-Inspired Computing, Part 6, vol. 25, no. 1, pp. 117-141, Nov. 2006.
  49. T. Kanamori, and I. Takeuchi,
    Conditional Mean Estimation under Asymmetric and Heteroscedastic Error by Linear Combination of Quantile Regressions.
    Computational Statistics and Data Analysis, vol. 50, issue 12, pp. 3605-3618, Aug. 2006.
  50. N. Murata, T. Takenouchi, T. Kanamori, S. Eguchi,
    Information Geometry of U-Boost and Bregman Divergence.
    Neural Computation, vol. 16, no. 7, pp. 1437-1481, July 2004.
  51. T. Kanamori, H. Shimodaira,
    Active Learning Algorithm Using the Maximum Weighted Log-likelihood Estimator.
    Journal of Statistical Planning and Inference, vol. 116, issue 1, pp. 149-162, Sep. 2003.
  52. I. Takeuchi, Y. Bengio, T. Kanamori,
    Robust Regression with Asymmetric Heavy-Tail Noise Distributions.
    Neural Computation, vol. 14, no. 10, pp. 2469-2496, Oct. 2002.
  53. T. Kanamori,
    Statistical Asymptotic Theory of Active Learning.
    Annals of the Institute of Statistical Mathematics, vol. 54, no. 3, pp. 459-475, Sep. 2002.
  54. T. Kanamori, H. Shimodaira,
    An Active Learning Algorithm Using an Information Criterion for the Maximum Weighted Log-likelihood Estimator.
    Proceedings of the Institute of Statistical Mathematics, vol. 48, no. 1, pp. 197-212, June 2000.
  55. T. Kanamori,
    Active Learning Algorithm using Maximum Weighted Likelihood Estimator.
    Bulletin of the Computational Statistics in Japan, vol. 11, no. 2, pp. 65-75, Oct. 1998.

International Conference Papers

  1. Takenouchi T., Kanamori T.
    Empirical Localization of Homogeneous Divergences on Discrete Sample Spaces.
    The Neural Information Processing Systems (NIPS 2015), poster & spotlight, to appear.
  2. Kanamori T.
    Legendre Transformation in Machine Learning.
    Workshop: Information Geometry for Machine Learning, December 2014.
  3. Fujisawa, H., Kanamori T.
    Affine invariant divergences with applications to robust statistics.
    The 7th International Conference of the ERCIM WG on Computational and Methodological Statistics (ERCIM 2014), the University of Pisa, Italy, 6-8 December 2014.
  4. Kanamori T., Fujisawa, H.
    Affine Invariant Divergences and their Applications.
    The 3rd Institute of Mathematical Statistics, Asia Pacific Rim Meeting, June 29-July 3, 2014.
  5. Sugiyama M., Kanamori T., Suzuki T., du Plessis M. C., Liu S., Takeuchi I.
    Density-Difference Estimation.
    The Neural Information Processing Systems (NIPS 2012), Lake Tahoe, Nevada, United States, 3-8 Dec., 2012.
  6. Kanamori T., Takeda A.
    Non-Convex Optimization on Stiefel Manifold and Applications to Machine Learning.
    The 19th International Conference on Neural Information Processing (ICONIP 2012), Doha, Qatar, 12-15 Nov., 2012.
  7. Takeda A., Kanamori T., Mitsugi H.
    Robust optimization-based classification method.
    The 21st International Symposium on Mathematical Programming (ISMP 2012), Berlin, Germany, 19-24 Aug., 2012.
  8. Kanamori T., Suzuki, T., Sugiyama, M.
    f-divergence estimation and two-sample test under semi-parametric density ratio models.
    The 2nd Institute of Mathematical Statistics, Asia Pacific Rim Meeting (ims-APRM 2012), Tsukuba, Japan, 2-4 July, 2012.
  9. Takeda, A., Mitsugi, H., Kanamori, T.
    A Unified Robust Classification Model.
    29th International Conference on Machine Learning (ICML2012), Edinburgh, Scotland, Jun. 26-Jul. 1, 2012.
  10. Kanamori, T., Takeda, A., Suzuki, T.
    A Conjugate Property between Loss Functions and Uncertainty Sets in Classification Problems.
    25th International Conference on Learning Theory (COLT2012), Edinburgh, Scotland, Jun. 25-Jun. 27, 2012.
  11. Yamada, M., Suzuki, T., Kanamori, T., Hachiya, H., & Sugiyama, M.
    Relative density-ratio estimation for robust distribution comparison.
    Presented at Neural Information Processing Systems (NIPS2011), Granada, Spain, Dec. 13-15, 2011
  12. Shimodaira H., Kanamori T., Aoki M., Mine K.
    Multiscale Bagging with Applications to Classification and Active Learning.
    The 2nd Asian Conference on Machine Learning, Nov. 2010.
  13. Aoki M., Kanamori T., Shimodaira H.
    Multiscale-bagging with Applications to Classification.
    The 2nd Asian Conference on Machine Learning, Nov. 2010.
  14. Kanamori T. and Ohara A.
    A Bregman extension of quasi-Newton updates.
    Information Geometry and its Applications, Germany, Aug. 2010.
  15. Sugiyama, M., Takeuchi, I., Kanamori, T., Suzuki, T., Hachiya, H., & Okanohara, D.,
    Conditional density estimation via least-squares density ratio estimation.
    In Proceedings of Thirteenth International Conference on Artificial Intelligence and Statistics (AISTATS2010), JMLR Workshop and Conference Proceedings, vol.9, pp.781-788, Sardinia, Italy, May 13-15, 2010.
  16. Sugiyama, M., Hara, S., von Bünau, P., Suzuki, T., Kanamori, T., & Kawanabe, M.,
    Direct density ratio estimation with dimensionality reduction.
    In S. Parthasarathy, B. Liu, B. Goethals, J. Pei, and C. Kamath (Eds.), Proceedings of the 10th SIAM International Conference on Data Mining (SDM2010), pp.595-606, Columbus, Ohio, USA, Apr. 29-May 1, 2010.
  17. T. Kanamori
    Efficient direct importance estimation for covariate shift adaptation and outlier detection.
    The 1st Institute of Mathematical Statistics, Asia Pacific Rim Meeting, Seoul, June 28-July 1, 2009.
  18. T. Kanamori, T. Suzuki, M. Sugiyama,
    Condition Number Analysis of Kernel-based Density Ratio Estimation.
    ICML workshop on Numerical Mathematics in Machine Learning, Montreal Canada, June 2009.
  19. Suzuki, T., Sugiyama, M., Kanamori, T., Sese, J.,
    Mutual Information Estimation Reveals Global Associations between Stimuli and Biological Process.
    The 7th Asia Pacific Bioinformatics Conference (APBC2009) Beijing, China, 13-16 January 2009.
  20. Kanamori, T.
    A Least-squares Approach to Direct Importance Estimation and its Applications.
    Joint Session of the CSA, JSS and KSS at 2008 Statistical Symposium, China, Taipei, Dec. 19, 2008.
  21. Hido, S., Tsuboi, Y., Kashima, H., Sugiyama, M., Kanamori, T.
    Inlier-based outlier detection via direct density ratio estimation.
    In Proceedings of IEEE International Conference on Data Mining (ICDM2008), Pisa, Italy, Dec. 15-19, 2008.
  22. Kanamori T., Sugiyama M., Hido S.
    Efficient Direct Density Ratio Estimation for Non-stationarity Adaptation and Outlier Detection.
    Neural Information Processing Systems (NIPS 2008), 2008.
  23. Suzuki T., Sugiyama M., Sese J., Kanamori T.
    A least-squares approach to mutual information estimation with application in variable selection.
    Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery 2008 (FSDM2008), Antwerp, Belgium, Sep. 15, 2008.
  24. Suzuki, T., Sugiyama, M., Sese, J. and Kanamori, T.
    Approximating mutual information by maximum likelihood density ratio estimation.
    In Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery 2008 (FSDM2008), JMLR Workshop and Conference Proceedings, Antwerp, Belgium, Sep. 15, 2008.
  25. Kanamori, T.
    Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability.
    18th International Conference on Algorithmic Learning Theory, Sendai International Center, Sendai, Japan, 2007.
  26. Kanamori, T.
    Worst-Case Violation of Sampled Convex Programs for Optimization with Uncertainty.
    International Conference on Continuous Optimization, Hamilton, Canada, 2007.
  27. Kanamori, T. and Takeda, A.
    Worst-Case Violation of Sampled Convex Programs for Optimization with Uncertainty.
    International Symposium on Mathematical Programming, Rio de Janeiro, Brazil, 2006.
  28. Takeuchi, I., Nomura, K. and Kanamori, T.
    The Entire Solution Path of Kernel-based Nonparametric Conditional Quantile Estimator.
    International Joint Conference on Neural Networks, Vancouver, Canada, 2006.
  29. Kanamori, T.
    Integrability of weak learner on boosting.
    The 2nd International Symposium on Information Geometry and its Applications, pp. 300-307, University of Tokyo, Tokyo, Japan, 2005.
  30. Kanamori, T. and Takeuchi, I.
    Estimators for Conditional Expectations under Asymmetric and Heteroscedastic Error Distributions.
    International Symposium on The Art of Statistical Metaware, The Institute of Statistical Mathematics, Tokyo, Japan, 2005.
  31. Kanamori T., Takenouchi, T., Eguchi, S., and Murata, N.
    The most robust loss function for boosting.
    Neural Information Processing: 11th International Conference (ICONIP 2004), Calcutta, India, Lecture Notes in Computer Science, vol. 3316, pp. 496-501, Springer.
  32. Kanamori T. and Takeuchi, I.
    Robust Estimation of Conditional Mean by the Linear Combination of Quantile Regressions.
    International Conference on Robust Statistics, Beijing, China, 2004.
  33. Kanamori, T.
    A New Sequential Algorithm for Regression Problems by using Mixture Distribution.
    In Proceedings of 2002 International Conference on Artificial Neural Networks (ICANN'02), pp. 535-540, Madrid, Spain, August 2002.
  34. Bengio, Y., Takeuchi, I. and Kanamori, T.
    The Challenge of Non-Linear Regression on Large Datasets with Asymmetric Heavy Tails.
    In Proceedings of the Joint Statistical Meeting, American Statistical Association, New York, U.S.A., August 2002.
  35. Shimodaira, H., and Kanamori, T.
    Information Criteria for Predictive Inference with the Weighted Log-Likelihood and the Active Learning.
    International Society for Bayesian Analysis, Sixth World Meeting, Hersonissos, Heraklion, Crete, May 2000.

Articles

  1. Kanamori, T.,
    Divergence Estimation using Density-ratio and its Applications (in Japanese),
    Japan Statistical Society, Sep., 2014.
  2. Kanamori, T.,
    Statistics: Statistical Learning Theory (in Japanese),
    Suugaku Seminar, May 2011.
  3. Sugiyama, M., Suzuki, T., Kanamori, T.,
    Density ratio estimation: A comprehensive review.
    In Statistical Experiment and Its Related Topics, Research Institute for Mathematical Sciences Kokyuroku, no. 1703, pp. 10-31, 2010.
  4. Murata, N., Kanamori, T., Takenouchi, T.
    Boosting and Learning Algorithms (in Japanese),
    The Journal of the Institute of Electronics, Information and Communication Engineers, vol. 88, no. 9, pp. 724-729, 2005.
  5. Kanamori, T., Murata, N.
    Boosting and Robustness (in Japanese),
    The Journal of the Institute of Electronics, Information and Communication Engineers, vol. 86, no. 10, pp. 769-772, 2003.

Books

  1. T. Kanamori, T. Suzuki, I. Takeuchi, I. Sato
    Continuous Optimization for Machine Learning (in Japanese)
    Kodansya scientific, Dec. 2016.
     
  2. T. Kanamori
    Statistical Learning Theory (in Japanese)
    Kodansya scientific, Aug. 2015.
     
  3. M. Sugiyama, T. Suzuki, T. Kanamori
    Density Ratio Estimation in Machine Learning
    Cambridge University Press, Feb. 2012.
     
  4. T. Kanamori, T. Takenouchi, N. Murata
    Pattern Recognition in R (in Japanese)
    Kyoritu Syuppan, Oct. 2009.
     
  5. T. Kanamori, K. Hatano, O. Watanabe
    Boosting (in Japanese)
    Morikita Syuppan, Sep. 2006.
     

Book Chapters

  1. [Chapter contribution] Loss Functions and Risk Measures in Statistical Learning Theory
    in Aspects of Modeling (in Japanese), Kindai Kagakusya, Sep. 2016.
        
  2. [Translation] Randomized Optimization
    in Handbook of Monte Carlo Methods (in Japanese), Asakura Shoten, Oct. 2014.
        
  3. [Translation] Model Assessment and Selection
    in The Elements of Statistical Learning (in Japanese), Kyoritsu Syoten, June 2014.
  4. [contribution] Statistical Learning Theory
    in Handbook of Applied Mathematics (in Japanese)
    Asakura Shoten, Nov. 2013.
     
  5. [contribution] Ensemble Learning
    in Encyclopedia of Mathematical Engineering (in Japanese)
    Asakura Shoten, Nov. 2011.
     
  6. T. Kanamori, H. Shimodaira,
    [Chapter contribution] Geometry of Covariate Shift with Applications to Active Learning
    in Dataset Shift in Machine Learning, MIT Press, 2008.
     
