Sunday, 07 October 2012

[smf_addin] Digest Number 2365

15 New Messages

Digest #2365
1a
Re: index symbols by "ntknow" ntknow
1b
Re: index symbols by "zarathustra_winced@yahoo.com" zarathustra_winced
2a
Re: Welcome back Randy!!! by "NicholasDavid" davidnicholas738
2b
Re: Welcome back Randy!!! by "Ron Spruell" hashky
2c
Re: Welcome back Randy!!! by "Kermit W. Prather" kermitpra
3d
Re: Screen scrape data to plot intraday chart by "metastkuser100" metastkuser100
7a
Re: Fidelity Website for Bonds by "Randy Harmelink" rharmelink

Messages

Sun Oct 7, 2012 7:40 am (PDT) . Posted by:

"ntknow" ntknow

The ^GSPC and ^IXIC symbols both work fine. However, ^DJI returns "missing symbols list." I see from searching the message posts that others have had the same result, but I have not found an answer that works for me.

BTW, I'm so glad to have found this add-in. I used to use the MSN Money add-in, but goofed up and uninstalled it because of a problem... now I can't find a download for it that works :(

But if I can find a symbol that works in this for the DJIA, I'll be all fixed.

Thanks Randy for your work.

--- In smf_addin@yahoogroups.com, Randy Harmelink <rharmelink@...> wrote:
>
> ^DJI no longer works, because Yahoo is no longer licensed to provide files
> of their quotes data.
>
> You just need to use the Yahoo symbol lookup for the tickers you want --
> S&P 500 would be ^GSPC, NASDAQ 100 would be ^NDX, etc. But Yahoo may not
> have historical data for the lesser indexes.
>
> On Thu, Oct 4, 2012 at 12:49 PM, ntknow <livinsfun@...> wrote:
>
> > New to this but have the individual stock prices working. Can someone
> > tell me the symbols to use for the Dow, S&P500, NASDAQ, and Philadelphia
> > Gold & Silver Index. The Yahoo ^symbols don't seem to work.
> >
>

Sun Oct 7, 2012 7:52 am (PDT) . Posted by:

"zarathustra_winced@yahoo.com" zarathustra_winced

DJIA historical data is no longer provided by Yahoo.

Sent via BlackBerry by AT&T








Sun Oct 7, 2012 7:55 am (PDT) . Posted by:

"NicholasDavid" davidnicholas738



My best wishes to you, Randy. Good luck with the weight loss. You can do it.

David

----- Original Message -----

From: "Randy Harmelink" <rharmelink@gmail.com>
To: "smf addin" <smf_addin@yahoogroups.com>
Sent: Saturday, October 6, 2012 4:25:17 PM
Subject: Re: [smf_addin] Welcome back Randy!!!

 

Appreciate it. I was in the hospital for 4 days and then rehab for 17 days, because of a continuing leg/knee problem. Still not solved, and won't be unless I lose a lot of weight, but at least I can transfer well enough to get around.

On Sat, Oct 6, 2012 at 2:21 PM, Ron Spruell < hashky@yahoo.com > wrote:

I didn't know how long you were going to be AWOL, so I started trying to answer some of the questions that I thought I knew the answers to.

Sun Oct 7, 2012 9:43 am (PDT) . Posted by:

"Ron Spruell" hashky

I know the feeling. Getting old (and losing weight) is not for wimps. Good luck with losing the weight. Like the rest of us, you can't do anything about getting older.


Sun Oct 7, 2012 11:37 am (PDT) . Posted by:

"Kermit W. Prather" kermitpra

Welcome back, Randy

Glad you are okay.

Hope all goes well with the weight-loss program. I've tried it myself, and it takes discipline and exercise. I tried once to just reduce food intake, and that did nothing. Exercise is limited for many of us, but there are things you can do sitting down that help.

Good luck & welcome back.
Kermit


Sun Oct 7, 2012 9:48 am (PDT) . Posted by:

"freebelvin" freebelvin

Greetings to all via my first post to this group.

I recently joined this group because you all do some things I wanted to learn. I've been studying the literature on the site and I'm very impressed with the level of technology, and wish to express my thanks to Randy and others for developing these tools and generously sharing them with others.

It turns out that I'm an engineer and have a few tricks too. I thought I might take this opportunity to describe what I've been doing, and the question in this post is closely related to my approach. First let me say that I do not in any way put this out as any form of technology competition! I only bring this up as a way of expanding people's awareness of what is possible.

What I have done is to use the same URL-linking and fetching mechanisms embedded inside the Excel macros, but I have embedded them instead in Python scripts. This takes much more programming effort, but it also provides much more flexibility in handling the data. In the example below, a script could be coded with the website URL and the fields to extract, and could fetch the data on a schedule. The data could then be printed to the screen, or additional algorithms could be attached to generate alerts, perform follow-on calculations, etc. The possibilities are broad, but I admit this does require a level of software skill. And I will comment that, as a C programmer, learning Python syntax has been quite an adventure for me.

OK, that's it. If anyone wishes to know more about this approach to processing web-based market data, please let me know and I can follow up. Again, please consider this to be an augmentation of the technology already existing in this group, and not as the beginning of any type of competition!

belvin
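[The fetch-and-extract step belvin describes could be sketched in Python like this. This is only an illustrative sketch, not an actual script from this group: the URL, the regex pattern, and the sample HTML are hypothetical placeholders for whatever quote page a real script would read.]

```python
import re
import urllib.request

def extract_number(html, pattern):
    """Pull the first number matching `pattern` out of a page's HTML.
    Commas are stripped so a quote like 13,610.15 parses as a float."""
    m = re.search(pattern, html)
    return float(m.group(1).replace(",", "")) if m else None

def fetch_quote(url, pattern):
    """Fetch a page and extract one number from it. The URL and regex
    are placeholders for a real quote source."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return extract_number(html, pattern)

# Hypothetical usage against a saved page fragment:
# extract_number("<td>Last: 13,610.15</td>", r"Last:\s*([\d,.]+)")
```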

--- In smf_addin@yahoogroups.com, "metastkuser100" <metastkuser100@...> wrote:
>
> I would like to:
>
> a) Screen scrape two numbers from a website, every 15 minutes, which then populate a table in Excel, throughout market hours, M-F.
>
> b) Then plot an Excel chart based on the data in the table. As the day progresses, the table and the chart gets automatically updated every 15 minutes.
>
> This is possibly way beyond my abilities, but I'm willing to give a real good try!
>
> Can the first part be done by SMF plugin?
>

Sun Oct 7, 2012 11:14 am (PDT) . Posted by:

"MS" metastkuser100

I'd be happy to pay a few bucks to a Python/Ruby/VBA/whatever programmer to put this spreadsheet together for me.

Very simple specs: scrape data off website automatically every 15 minutes, store in spreadsheet as static values, display data as an Excel line chart. 26 data points per day, for many days. Bells/whistles like alerts, number crunching are afterthoughts. andysmith999@hotmail.com
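[As a rough illustration of these specs, and not a finished deliverable, a Python sketch that polls on a 15-minute schedule during assumed US market hours and appends static, timestamped rows to a CSV file that Excel can open and chart. The `scrape` callable and the file path are placeholders; how the two numbers are obtained (SMF, a Python fetch, etc.) is left open, as in the thread.]

```python
import csv
import datetime
import time

INTERVAL = 15 * 60                   # 15 minutes, in seconds
MARKET_OPEN = datetime.time(9, 30)   # assumed US market hours,
MARKET_CLOSE = datetime.time(16, 0)  # read off the local clock

def in_market_hours(now):
    """True on weekdays between the open and close times above. A real
    script would pin these to the exchange's time zone."""
    return now.weekday() < 5 and MARKET_OPEN <= now.time() <= MARKET_CLOSE

def record(path, values):
    """Append one timestamped row of static values to a CSV file,
    which Excel can open and plot as a line chart."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.datetime.now().isoformat()] + list(values))

def run(path, scrape):
    """Poll every 15 minutes during market hours; `scrape` is a
    caller-supplied function returning the two numbers."""
    while True:
        if in_market_hours(datetime.datetime.now()):
            record(path, scrape())
        time.sleep(INTERVAL)
```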


Sun Oct 7, 2012 11:16 am (PDT) . Posted by:

"MS" metastkuser100

andysmith999 at hotmail dot com


Sun Oct 7, 2012 11:27 am (PDT) . Posted by:

"metastkuser100" metastkuser100

I'd be happy to pay a few bucks to a Python/Ruby/VBA/whatever programmer to put
this spreadsheet together for me.

Very simple specs: scrape data off website automatically every 15 minutes, store
in spreadsheet as static values, display data as an Excel line chart. 26 data
points per day, for many days. Bells/whistles like alerts, number crunching are
afterthoughts. andysmith_999 at hotmail dot com


Sun Oct 7, 2012 11:59 am (PDT) . Posted by:

"dguillett1" donaldb36

I'm an Excel VBA programmer. Contact me with the details.

Don Guillett
Microsoft Excel Developer
SalesAid Software
dguillett1@gmail.com


Sun Oct 7, 2012 11:46 am (PDT) . Posted by:

"smithjhhic" smithjhhic

dgoyal,
Thank you very much for pointing out how to obtain data for 16 days.

Randy,
Great to have you back!

All,
I am not very experienced and am having difficulty implementing bushpilote's suggestion. I am trying to return the "14-day SBV" and "14-day Volume MA" for the most recent date. Do I need to bring back the entire table to obtain the data for those two cells for the most recent date? Is there a formula I can enter to obtain just those two cells for a symbol in C3? Any help would be appreciated.
v/r,
Jeff

--- In smf_addin@yahoogroups.com, "dgoyal" <dgoyal@...> wrote:
>
> You can get data for periods other than 15 days by adding "&p1=number of days" to the URL. For example, to get 16 days, you would use the following URL.
>
> http://www.marketvolume.com/stocks/moneyflow.asp?s=AAPL&t=apple&p1=16
>
> --- In smf_addin@yahoogroups.com, bushpilote@ wrote:
> >
> >
> > You are welcomed.
> > --- In smf_addin@yahoogroups.com, "smithjhhic" <smithjhhic@> wrote:
> > >
> > > Thank you very much for your help.
> > >
> > >
> > > --- In smf_addin@yahoogroups.com, "bushpilote" <bushpilote@> wrote:
> > > >
> > > > Hi,
> > > > try this for the 14-day data:
> > > >
> > > > =RCHGetTableCell($C$4,C$5,"<body",,,,$B13,,,"--")
> > > >
> > > > where $C$4 = http://www.marketvolume.com/stocks/moneyflow.asp?s=AAPL&t=apple
> > > >
> > > > C$5:J$5 = Column numbers starting with 1 ending with 8 in increments of 1 (there are 8 columns of data in the table)
> > > >
> > > > $B13:$B84 = row numbers starting from 8 ending with 79 in increments of one (there are 72 rows of data in the table including headings). One to seven (B6:B12) returns a combination of blanks, explanatory data and duplicate headings.
> > > >
> > > > Copy the formula in C5:B84.
> > > >
> > > > PS There's nothing sacred about starting in cells C4,C5 and B13.
> > > >
> > > > Can't help you with the "other than 14-day" question.
> > > >
> > > > Cheers.
> > > >
> > > > --- In smf_addin@yahoogroups.com, "smithjhhic" <smithjhhic@> wrote:
> > > > >
> > > > > Hello,
> > > > > I am trying to return "14-day SBV" and "14-day Volume MA" for the latest date (see site link below). There were some emails post relating to this site from 2009, but I was unable to apply those to my request. Also, the site allows the user to change the bar period from 14 day to another value (ie. 16 day). Is there away to use something other than 14 day with the addin? I appreciate any help you could provide.
> > > > >
> > > > > http://www.marketvolume.com/stocks/moneyflow.asp?s=AAPL&t=apple
> > > > >
> > > >
> > >
> >
>

Sun Oct 7, 2012 5:29 pm (PDT) . Posted by:

"Randy Harmelink" rharmelink

Is this all you want:

=RCHGetTableCell("http://www.marketvolume.com/stocks/moneyflow.asp?s=AAPL",8,">Change",,,,1)
=RCHGetTableCell("http://www.marketvolume.com/stocks/moneyflow.asp?s=AAPL",5,">Change",,,,1)

On Sun, Oct 7, 2012 at 11:46 AM, smithjhhic <smithjhhic@yahoo.com> wrote:

> I am not very experienced and having difficulty implementing "bushpilote"
> suggestion. I am trying to return "14-day SBV" and "14-day Volume MA" for
> the most recent date. Do I need to bring back the entire table to obtain
> the data for those two cells for the most recent date? Is there a formula I
> can enter to obtain just those two cells for a symbol in C3? Any help would
> be appreciated.
>

Sun Oct 7, 2012 3:14 pm (PDT) . Posted by:

"JCHyjun" JCHyjun

Hello,

We know that companies cook the books, so some of the data we use are not reliable. There are at least two methods for uncovering companies that are probably manipulating their reported earnings: the Beneish M-Score and the Montier C-Score (similar to the Piotroski and Altman scores).

Has anybody tried to use SMF for the M-Score or the C-Score?

J.C. Hyjun
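[For anyone exploring this, the Beneish M-Score itself is just a weighted sum of eight year-over-year indexes, so once the statement items are pulled (e.g. with SMF) the score is a one-liner. A hedged Python sketch follows; the coefficients are those of Beneish's published eight-variable model, and computing the input indexes from financial-statement data is left to the reader.]

```python
def beneish_m_score(dsri, gmi, aqi, sgi, depi, sgai, tata, lvgi):
    """Eight-variable Beneish M-Score. Inputs are year-over-year index
    ratios built from financial-statement items; scores above roughly
    -1.78 flag a likely earnings manipulator."""
    return (-4.84
            + 0.920 * dsri   # Days Sales in Receivables Index
            + 0.528 * gmi    # Gross Margin Index
            + 0.404 * aqi    # Asset Quality Index
            + 0.892 * sgi    # Sales Growth Index
            + 0.115 * depi   # Depreciation Index
            - 0.172 * sgai   # SG&A Index
            + 4.679 * tata   # Total Accruals to Total Assets
            - 0.327 * lvgi)  # Leverage Index

# A company with every index at 1.0 and zero accruals scores well
# below the manipulation threshold:
print(round(beneish_m_score(1, 1, 1, 1, 1, 1, 0, 1), 2))
```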

Sun Oct 7, 2012 5:23 pm (PDT) . Posted by:

"Randy Harmelink" rharmelink

Can you give me two example formulas, where the first one works fine and
the second one returns blanks? Preferably from reuters.com.

Also, which version of the add-in are you running?

On Tue, Sep 25, 2012 at 6:20 PM, dlx_0lb <dlx_0lb@yahoo.com> wrote:

>
>
> I signed up to reuters.com and market.ft.com. I log into reuters.com and
> market.ft.com via Excel Web Query, and then start using the RCHGetHTMLTable
> function.
> It works fine for the first ticker, but returns blanks for the rest. I
> deleted all cookies, history, and temporary files and logged in again, but
> it only worked for another first ticker.
> Any ideas?
>

Sun Oct 7, 2012 5:56 pm (PDT) . Posted by:

"Randy Harmelink" rharmelink

What are you trying to retrieve? To get the entire table, or the Yield from
the row for the first listed item:

=RCHGetHTMLTable("https://fixedincome.fidelity.com/ftgw/fi/FIIndividualBondsSearch?cusip=404280AE9","Found: ",1,"",1)

=RCHGetTableCell("https://fixedincome.fidelity.com/ftgw/fi/FIIndividualBondsSearch?cusip=404280AE9",1,"FIBondDetails?")

On Mon, Sep 17, 2012 at 1:31 PM, Steven <stevenletzer@yahoo.com> wrote:

> Fidelity's public website with public bond information is quite simple to
> use. The base address is
> https://fixedincome.fidelity.com/ftgw/fi/FIIndividualBondsSearch?cusip=
> and for CUSIP 404280AE9 the full address is
> https://fixedincome.fidelity.com/ftgw/fi/FIIndividualBondsSearch?cusip=404280AE9.
> The data returned appears to be in table format, but I am having
> difficulty capturing it. The table headers are:
>
> Description, Coupon, Maturity Date, Moody's, S&P, Bid, Ask, and so on. The
> first column of the table is Description. I have not found a word to anchor
> to. It would be easier if the CUSIP were the first (or any) column.
>
> Any ideas?
>
