Search Results
Suggested topics within your search: C52 (1), G11 (1), american dollar (1), exchange-rate (1), mexican peso (1), modelo de Markov (Markov model), purchasing power (1), índice Banamex-30 (Banamex-30 index) (1)
Showing 1 - 1 of 1 results
Ramírez, José Carlos; Sandoval Saavedra, Rogelio. "Una propuesta para evaluar pronósticos de rendimientos de acciones cuando las distribuciones empíricas no son normales estacionarias" [A proposal for evaluating stock return forecasts when the empirical distributions are not stationary normal]. Estudios Económicos, 2003.
Refine results
Journal: Estudios Económicos (1)
Indexes: CONAHCYT, Cengage Learning, DORA, Dialnet, EconLit, Gale OneFile: Informe Académico, Global Issues in Context, Google Scholar, HAPI, Handbook of Latin American Studies (HLAS), IBSS, InfoTracCustom, JSTOR, LATINDEX, PKP Index, RePEc, Redalyc, Scielo México, The Journal of Economic Literature, Ulrich's International Periodicals Directory, Índice bibliográfico Publindex (1 result each)
Format: Online (1)
Author: Ramírez, José Carlos (1); Sandoval Saavedra, Rogelio (1)
Language: Spanish (1)
Applied filter: Suggested topic "modelo de Markov"